US20100266171A1 - Image formation apparatus and method for nuclear imaging - Google Patents
- Publication number
- US20100266171A1 (application US 12/601,800)
- Authority
- US
- United States
- Prior art keywords
- detector
- data
- image
- radiation
- image generation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B6/5205—Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
- A61B6/037—Emission tomography
- A61B6/4258—Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector for detecting non x-ray radiation, e.g. gamma radiation
- G01T1/161—Applications in the field of nuclear medicine, e.g. in vivo counting
Definitions
- the present invention relates to image generating apparatuses and methods for image generation with image generating apparatuses.
- Specific embodiments of the invention relate to image generating apparatuses for enhanced image generation by means of quality control, instruction to a user for data collection and/or a continuous data collection with enhanced processing.
- Typical embodiments of the present invention relate to image generating apparatuses and methods for medical purposes.
- High quality image generation is of great interest for a vast area of applications.
- the best possible image generation is necessary, for example as a basis for surgery on the patient.
- medical images are generated either pre-operatively or intra-operatively.
- a registration of images is known, for example the registration of an anatomical image with a functional image, i.e., an image that visualizes body activity.
- Such registered images can for example help in tumor surgeries to decide which body tissue is to be cut out. Images that are as up-to-date and of as high quality as possible are desirable because damaging healthy tissue, or failing to remove diseased tissue, can thus be avoided.
- an image generating apparatus for image generation.
- the image generating apparatus includes a movable detector for detecting nuclear radiation during a detection period.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data to the evaluation system.
- the detector data include information about the detected nuclear radiation for image generation.
- the evaluation system further includes a data memory portion for storing the detector data.
- the evaluation system further includes a program memory portion with a program for repeatedly determining at least one quality value with respect to image generation from the detector data during the detection period.
- an image generating apparatus for image generation includes a freely movable detector for detecting radiation during a detection period.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for continuously transmitting detector data to the evaluation system during the detection period.
- the detector data include information about the detected radiation and information about the position and/or orientation of the detector for image generation.
- the evaluation system further includes a data memory portion for storing the detector data and a program memory portion with a program for determining at least one quality value with respect to the image generation from the detector data.
- an image generating apparatus for image generation.
- the image generating apparatus includes a freely movable detector for detection of radiation during a detection period.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for continuously transmitting detector data for image generation during a detection period.
- the detector data include information about the detected radiation.
- the detector data further include information about the position and/or orientation of the detector.
- the evaluation system further includes a data memory portion for storing detector data.
- the evaluation system further includes a program memory portion with a program for determining at least one quality value with respect to image generation from the detector data.
- an image generating apparatus for image generation.
- the image generating apparatus includes a movable detector for detecting radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- the detector data include information about the detected radiation.
- the detector data further include information about the position and/or orientation of the detector.
- the evaluation system further includes a data memory portion for storing detector data.
- the evaluation system further includes a program memory portion with a program for determining an image generation rule for image generation on the basis of the collected detector data, taking into account a detection model.
- the detection model takes into account a material property of a material influencing the detection and/or a constraint.
- an image generating apparatus for image generation.
- the image generating apparatus includes a movable detector for detection of radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- the evaluation system further comprises a program memory portion with a program for registering detector data with compatible data.
- an image generating apparatus for image generation.
- the image generating apparatus includes a movable detector for detection of nuclear radiation during a detection period.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- the detector data include information about the detected nuclear radiation.
- the evaluation system further includes a data memory portion for storing detector data.
- the evaluation system further includes a program memory portion with a program for determining an image generation rule on the basis of the collected detector data.
- the evaluation system further includes a program memory portion with a program for repeatedly modifying the image generation rule on the basis of at least one quality value during the detection period.
- a method for image generation by means of an image generating apparatus includes detecting nuclear radiation by means of a movable detector during a detection period.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- the detector data include information about the detected radiation.
- the method further includes repeatedly determining at least one quality value from the collected detector data by means of the evaluation system during the detection period and outputting an instruction to a user for further moving the detector in dependence of the collected detector data and/or of the at least one determined quality value, wherein the instruction relates to at least a part of the remaining detection period.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a freely movable detector of the image generating apparatus during a detection period, and changing position and/or orientation of the detector during the detection period.
- the method further includes continuously collecting detector data for image generation by means of an evaluation system of the image generating apparatus during the detection period.
- the detector data include information about the detected radiation and information about the position and/or orientation of the detector.
- the method further includes determining at least one quality value from the collected detector data by means of the evaluation system.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a movable detector of the image generating apparatus during a detection period. The method further includes changing the position and/or orientation of the detector during the detection period. The method further includes continuously collecting detector data for image generation by means of an evaluation system of the image generating apparatus during the detection period. The detector data include information about the detected radiation. The detector data further include information about the position and/or orientation of the detector. The method further includes determining at least one quality value from the collected detector data by means of the evaluation system.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a detector of the image generating apparatus.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- the detector data include information about the detected radiation.
- the detector data further include information about the position and/or orientation of the detector.
- the method further includes determining an image generation rule by means of the evaluation system for image generation on the basis of the collected detector data, taking into account a detection model.
- the detection model takes into account a material property of a material influencing the detection and/or a constraint.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a detector of the image generating apparatus.
- the method further includes collecting detector data of the detector for image generation by means of the evaluation system of the image generating apparatus.
- the method further includes registering the detector data with compatible data by means of the evaluation system.
- a method for image generation by means of an image generating apparatus includes detecting nuclear radiation by a movable detector of the image generating apparatus during a detection period.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- the detector data include information about the detected radiation.
- the method further includes determining an image generation rule by means of the evaluation system on the basis of the collected detector data.
- the method further includes repeatedly modifying the image generation rule on the basis of at least one quality value during the detection period.
- FIG. 1 shows a schematic arrangement of an image generating apparatus according to embodiments of the invention
- FIG. 2 shows a detector system of the image generating apparatus according to embodiments of the invention
- FIG. 3 shows a detection system of the image generating apparatus according to embodiments of the invention
- FIG. 4 shows a schematic arrangement of an evaluation system of the image generating apparatus according to embodiments of the invention
- FIG. 5 shows a schematic arrangement of program memory portions of the evaluation system according to embodiments of the invention.
- FIG. 6 shows an output system of the image generating apparatus according to embodiments of the invention
- FIG. 7 shows a further output system of the image generating apparatus according to embodiments of the invention.
- FIG. 8 shows a guiding system of the image generating apparatus according to embodiments of the invention.
- FIG. 9 shows an image generating apparatus according to embodiments of the invention in use in the medical field
- FIG. 10 shows the generation of a detection model according to embodiments of the invention.
- FIG. 11 shows the generation of a detection model via measurements according to embodiments of the invention.
- FIG. 12 shows a quality control process according to embodiments of the invention
- FIG. 13 shows an iterative flow diagram with a step of instructing a user according to embodiments of the invention
- FIG. 14 shows a detection process with a freely movable detector according to embodiments of the invention.
- detection period denotes a period between the beginning of a first detection process and the end of a last detection process.
- the first and last detection process can be identical such that the detection period is a period during which a detection process continuously takes place.
- the first and last detection can also be different.
- in between, other processes can therefore lie.
- such other processes can be data evaluation processes.
- the at least one detection process taking place in the detection period is carried out by the same detector, respectively detector system, on the same object.
- An example for a detection period is the period between the first measurement of nuclear radiation with a gamma probe on a patient and the last measurement, wherein for example after the last measurement a final image with the visualization of body functions can be generated.
- a detection period would for example not be defined by a first measurement only on the back of a patient and by a further measurement only on the stomach of the patient.
- Specifying that an action is carried out “during a detection period” (or more generally during any period) is not to be construed in the sense that the action needs to fill the full period.
- the action can also only take place during part of the detection period.
- the action can also be interrupted.
- freely movable is generally understood to mean that the position and/or orientation of an object which is freely movable can be changed substantially arbitrarily.
- a detector which is handheld is a freely movable detector.
- a detector which is mounted to a robot arm with sufficiently many degrees of freedom is freely movable, wherein the robot arm is for example controlled by a user.
- a detector which is movable along a rail is movable but not freely movable.
- the expression “continuous” includes, when relating to an action such as “continuously collecting detector data”, an ongoing or regularly repeated action.
- the temporal distances between the regular repetitions can in principle be arbitrarily short, i.e. quasi-continuous.
- physical constraints can, however, limit how short these distances can be.
- detectors can have so-called “dead times” such that during such dead times no detection can take place. Consequently, also during e.g. a continuous collection of the detector data, a regular repetition of data collection within the collection process may not be possible within time intervals that are shorter than said dead times.
- the “generation of an image” includes the generation of image data without the need for output of such image data to an output unit, for example a monitor.
- FIG. 1 shows an image generating apparatus 1 according to embodiments of the invention.
- the image generating apparatus 1 includes a detector system 100 .
- the detector system 100 includes at least one detector 110 .
- the image generating apparatus further includes an evaluation system 300 .
- the evaluation system 300 includes at least one memory unit 310 and at least one processing unit 350 .
- the detector system and the evaluation system are linked by a data exchange system 20 .
- the image generating apparatus includes a tracking system 200 as shown in FIG. 1 .
- the tracking system 200 includes at least one tracking unit 210 .
- the image generating apparatus includes an output system 400 .
- the output system includes at least one output unit 410 .
- the tracking system 200 and the output system 400 are connected to the evaluation system 300 by means of a data exchange system.
- the image generating apparatus includes a guiding system 500 .
- the guiding system 500 includes at least one guiding unit 510 .
- the guiding system can be connected to the evaluation system by means of a data exchange system. The individual systems are described in more detail in the following.
- the detector system 100 includes a detector 110 .
- the detector 110 is a radiation detector, typically a detector for nuclear radiation.
- the detector is movable, according to specific embodiments even freely movable.
- the detector is handheld.
- the detector can be a gamma radiation probe, a beta radiation probe, a Compton probe, a gamma radiation camera, a gamma radiation mini camera, a beta radiation camera or a Compton camera.
- the detector can also be a detector for optical radiation, a detector for infrared radiation, x-rays or a detector for other kinds of radiation or any other kind of detector.
- Detector data can include information about the detected radiation.
- the detector data can be formatted to a certain degree but generally the association of single data sets to specific detection events or at least to a group of detection events should be possible.
- the detector data can also include position and/or orientation of the detector.
- Detector data can further include other data.
- the detector system 100 includes at least one further detector.
- the at least one further detector can be similar to the detector 110 or identical in construction.
- the at least one further detector can also be of a different kind as compared to detector 110 .
- the at least one further detector can, for example, be an ultrasonic probe, an x-ray detector, an optical camera, an optical microscope, a fluorescence camera, an auto-fluorescence camera, a magnetic resonance tomography detector, a positron emission tomography detector, short PET, a single photon emission computed tomography detector, short SPECT, or another kind of detector.
- FIG. 2 shows a detector system 100 according to embodiments of the present invention.
- two detectors 110 , 120 of the detector system 100 are shown: a probe 110 for detecting nuclear radiation and an optical camera 120 .
- the nuclear radiation can for example be gamma, beta, Compton, x-ray, or alpha radiation.
- a nuclear radiation source 10 that is to be detected is shown.
- a radiation source can generally be, here and in the following, a spatially distributed radiation source, i.e. a spatial radiation distribution.
- a radiation source can also be a substantially two dimensional radiation source, i.e. a radiation distribution that is substantially plane.
- the detectors can be handheld as shown and be movable and orientable in the three spatial directions, i.e. freely movable. Further, the detectors 110 , 120 each have a cable for power supply and for data exchange, e.g. with the evaluation system 300 shown in FIG. 1 . Further, the detectors 110 , 120 each have markings for tracking by the tracking system 200 shown in FIG. 3 , as further described below with respect to FIG. 3 . There can also be a tracking system 200 that works without markings.
- Detector data such as detector data with information about measured radiation, can be provided to the evaluation system 300 (see FIG. 1 ).
- the evaluation system 300 can collect the detector data.
- the image generating apparatus includes a tracking system 200 .
- the tracking system 200 includes a tracking unit 210 .
- the tracking unit can be an optical, electromagnetic, mechanical, robot-based, radio wave-based, sound wave-based, goniometer-based, potentiometer-based, gyroscope-based, acceleration sensor-based, radiation-based, or x-ray-based detection unit, or an infrared or white light detection unit or another kind of detection or tracking unit.
- the tracking system 200 includes a further tracking unit 220 or further tracking units.
- the tracking unit 220 or the further tracking units can be tracking units as the ones listed above or can be other tracking units. To guarantee feasibility or reliability of the tracking system, some embodiments have at least two, at least three or at least four tracking units.
- FIG. 3 shows a tracking system 200 according to typical embodiments of the present invention.
- FIG. 3 shows two optical tracking units 210 and 220 .
- the optical tracking units 210 and 220 detect markings 112 on the nuclear radiation probe 110 and markings 122 on the optical camera 120.
- the optical tracking units 210 and 220 generate, by detecting the markings 112 and 122 , data with information about the position and/or orientation of the probe 110 and the camera 120 .
- the optical tracking units 210 and 220 are exactly calibrated, and the position and orientation of the probe 110 and of the camera 120 are determined by detecting the positions of the markings 112 and 122, respectively, by means of known triangulation methods.
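- The triangulation step can be illustrated with a short sketch. The following Python snippet is only an illustrative example and not part of the patent: it assumes each calibrated tracking unit can be described by a 3x4 projection matrix and reconstructs one marker position from two pixel observations via standard linear (DLT) triangulation; all names and the camera model are assumptions.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated tracking units.

    P1, P2   : 3x4 projection matrices of the two tracking units (assumed known from calibration)
    uv1, uv2 : (u, v) pixel coordinates of the marker in each view
    Returns the marker position as a 3-vector in world coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear equations A x = 0 in the
    # homogeneous marker position x = (X, Y, Z, 1).
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    x = vt[-1]
    return x[:3] / x[3]
```

With several marker positions of the markings 112 triangulated this way, the position and orientation of the probe 110 can be obtained by fitting the known marker geometry to the reconstructed points.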
- Data of the tracking systems can be provided to the evaluation system 300 .
- the evaluation system 300 can collect such and other detector data.
- the evaluation system 300 includes a memory system 302 with a memory unit 310 .
- the memory unit 310 can for example be a computer hard drive or another mass storage device, or can be of a different kind.
- the storage unit 310 includes a data storage portion 320 .
- the data storage portion 320 can for example be used for storing detector data.
- the data storage portion 320 can also be used for storing other data.
- the storage unit 310 includes a program storage portion 330 .
- the program storage portion 330 as well as further program storage portions according to further embodiments will be described further below.
- the data storage unit 310 can include further data storage portions and further program storage portions.
- the different storage portions need not physically or in a memory-technical sense form a unit; different portions are rather distinguished only with respect to the nature of the data stored or to be stored therein.
- the memory system 302 can include further memory units.
- the further memory units can be similar to the memory unit 310 or of a different kind.
- the evaluation system 300 includes a processing system 304 .
- the processing system 304 includes a processing unit 350 according to some embodiments.
- the processing unit 350 can for example be the computing part of a computer, for example a processor.
- the processing system 304 includes further processing units, which can be similar to the processing unit 350 or be of a different kind.
- at least one processing unit and at least one memory unit can be integrated in special devices, such as commercially available computers.
- the evaluation system includes an interface system 306 .
- the interface system 306 includes a detector system interface 306 a with a detector interface 380 for data exchange with a detector, for example with the detector 110 .
- the interface system 306 includes a tracking system interface 306 b with a tracking unit interface 390 for data exchange with a tracking system (for example, the tracking system 200 of FIG. 3 ).
- An interface system 306 or parts thereof can also be integrated in special devices, such as commercially available computers.
- the evaluation system communicates with other partial systems of the image generating apparatus via such interface systems by means of a data exchange system.
- the program memory portion 330 includes a program. As shown in FIG. 5 , the program can for example be a program 330 for determining at least one quality value on the basis of detector data.
- a memory unit includes further program memory portions, for example further program memory portions 332 and 334 with program 332 a for determining an image generation rule on the basis of detector data while taking into account a detection model, and respectively with a program 334 a.
- Program 334 a includes program part 334 b for determining at least one quality value on the basis of detector data and program part 334 c for repeatedly determining at least one quality value on the basis of detector data.
- programs which for example carry out similar functions, can also be formed as program parts of a single program, as for example described above for program 334 a.
- the same is also true for functionally different programs.
- the first program portion with a first program and a second program portion with a second program are identical, and the first and second program are considered as parts of a single program.
- in embodiments in which a first program portion with a first program and a second program portion with a second program are provided, the first program portion can be identical to the second program portion, and likewise the first program can be identical to the second program.
- the image generating apparatus includes an output system 400 according to further embodiments.
- the output system 400 includes an output unit 410 according to some embodiments.
- the output unit 410 can be a visual, acoustical or haptic output unit or a combination form thereof.
- the output unit 410 is an output unit for displaying images or an instruction to a user.
- a user is usually a human being.
- a user can also be a different living being or an inanimate object, for example a machine.
- the output system 400 includes further output units. These can be of similar kind as the output unit 410 or of a different kind.
- Output units can display reality, display a virtual reality or display an augmented reality.
- An output unit of an augmented reality can for example combine a real image with virtual images.
- an output unit can, among others, be one of the following: monitor, optically transparent monitor, stereo monitor, head-mounted stereo displays, acoustical frequency-coded feedback systems, acoustical pulse-coded feedback systems, force-feedback joysticks or force-torque-feedback joysticks or other kinds of visual, acoustical and/or haptic output units or combinations thereof.
- FIG. 6 shows an output unit 410 according to embodiments of the present invention.
- the output unit 410 is an optical output unit, in particular a monitor.
- FIG. 6 shows further an acoustical output unit 420 .
- the acoustical output unit is a loudspeaker.
- FIG. 7 shows a further output unit 430 in form of a head-mounted visual display.
- the image generating apparatus includes a guiding system 500 , as for example shown in FIG. 8 .
- the guidance system 500 includes a guiding unit 510 .
- a guiding unit 510 can for example guide an object by means of a robot arm.
- the guiding unit 510 can also guide a user. Guiding can also be robot-based or else can rely upon optical, acoustical or haptic signals or on combinations thereof.
- the guiding unit 510 shown in FIG. 8 guides the user by haptic signals.
- the guiding unit 510 serves for better guiding a surgical instrument 40 during surgery on a body 30 .
- the guiding unit may for example provide a resistance, be it by mechanical hindrance or by stimulation of the muscles by means of electrical pulses.
- the guiding unit 510 can also be formed by output units of the output system if the guidance of the user is effected by a corresponding output.
- the guiding system 500 can therefore be identical with the output system 400 .
- the image generating apparatus includes a data exchange system.
- the data exchange system serves for exchanging data between systems of the image generating apparatus, for example for exchange of data between detector system and evaluation system, between tracking system and evaluation system, between output system and evaluation system, or between guiding system and evaluation system (as shown in FIG. 1 by means of corresponding connection lines).
- the data exchange system can rely upon interfaces such as the detector interface 380 or the tracking system interface 390 , according to some embodiments.
- the exchange of data can take place by a connection of the systems by means of cables or else wireless or in any other way.
- FIG. 9 shows, according to embodiments of the invention, a body part of a human or other living being into which radioactive substances, so-called tracers, have been injected, which accumulate in certain preferred regions and remain stuck there.
- the regions or spatial areas in which the radioactive substances accumulate, or remain stuck, can be regarded as closed regions which include a source of nuclear radiation.
- FIG. 9 further shows a detector for nuclear radiation 110 .
- the detector 110 measures the nuclear radiation that emerges from the source within the body.
- FIG. 9 shows a laparoscope 120 which gathers data for generating an optical image of the interior of the body.
- the data gathered by the detector for nuclear radiation 110 and the laparoscope 120 are collected in the evaluation system (not shown) and are processed. Further, the position and/or orientations of the two detectors are tracked via markings 112 and 122 , and corresponding data is collected by the evaluation system.
- from all these data, the evaluation system generates, with the help of an image generation rule, an optical image of the interior of the body based on the data of the laparoscope, as well as a functional image that visualizes body functions such as metabolism, based on the data of the detector for nuclear radiation.
- the image can in particular be three dimensional.
- the optical anatomic image and the functional image are overlaid and, for example, displayed three dimensionally.
- the overlay is generated on the basis of a registration of the functional image with the optical anatomic image by means of the evaluation system.
- FIG. 9 shows a surgical instrument 40 the position and/or orientation of which are also tracked.
- the gathered data of the surgical instrument are also processed by the evaluation system.
- an image of the surgical instruments and of their location in the interior of the body can be determined by the evaluation unit.
- This image can also be registered with the anatomic and optical image and be displayed on the output unit 410 . If, in particular, the functional image is high quality and up-to-date and if the registration with the optical image and the instrument image is good, the output of the registered images on the output unit enables a surgeon to precisely control the surgery.
- an enhancement of the registration of the images can contribute to enhanced image generation. Changing and enhancing the imaging rule already during the detection period can also enhance image generation overall.
- means for enhancing image generation are provided.
- detector data are collected by the evaluation system.
- position and/or orientation of the detector can have been tracked by a tracking system.
- the detector data include information about the detected nuclear radiation, according to some embodiments.
- the detector data include information about the position and/or orientation of the detector.
- data with information about the detected radiation can be synchronized with data about the position and/or orientation of the detector and be collected in synchronized form.
- for the synchronization of data, see WO 2007/131561, in particular page 3, lines 1 to 6 and lines 27 to 32, and page 6, lines 22 to 30, incorporated herein by reference.
- WO 2007/131561 is further incorporated herein by reference in its entirety.
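- One conceivable way of collecting radiation readings and tracking data in synchronized form is to interpolate the tracked detector pose to the timestamp of each detection event. The sketch below is only an assumption for illustration (the cited WO 2007/131561 describes the actual synchronization); the data layout and field names are hypothetical.

```python
import numpy as np

def synchronize(events, poses):
    """Attach an interpolated detector position to each radiation event.

    events : list of dicts with keys 't' (timestamp) and 'counts' (detected radiation)
    poses  : list of dicts with keys 't' and 'position' (3-vector), sorted by time
    Returns a list of synchronized detector-data records.
    """
    pose_t = np.array([p['t'] for p in poses])
    pose_xyz = np.array([p['position'] for p in poses])
    synced = []
    for ev in events:
        # Linearly interpolate the position to the event time; the orientation
        # could be handled analogously (e.g. by quaternion slerp).
        pos = np.array([np.interp(ev['t'], pose_t, pose_xyz[:, k]) for k in range(3)])
        synced.append({'t': ev['t'], 'counts': ev['counts'], 'position': pos})
    return synced
```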
- the detector data are stored in the evaluation system.
- a detector detects radiation during a detection period.
- This radiation can be radioactive or nuclear radiation.
- Nuclear radiation is also to be understood as radiation which is indirectly generated by radioactive decay, for example the ionizing radiation of an alpha particle.
- Embodiments of the invention in which the detector measures nuclear radiation hence also include detecting of such secondary radiation.
- the evaluation system generates an image from the detector data by means of an image generation rule.
- this image is an image of the radiation distribution and thus of the radiation sources in a spatial region.
- the image generation rule is a linear rule. It can be described by an imaging matrix H, also called system matrix, which is applied to a vector f.
- the vector f contains the image information.
- this spatial region is divided into image elements (voxels).
- information elements with respect to these image elements form the entries f_i of the vector f for a corresponding index i.
- the entries H_ki of the imaging matrix H model the influence of a normalized radiation source at the position belonging to the index i onto the k-th measurement.
- in matrix notation with "*" as matrix product, the predicted detector data are g_predicted = H * f.
- such a vector g_predicted can be compared to a vector g_measured which contains the actual detector data with information about the detected radiation.
- in this comparison, different measurement errors, for example contributions of external radiation sources, imperfections of the detector, statistical errors, etc., are to be taken into account.
- the image is generated by finding, among the candidate images the image information of which is coded into a respective vector f, the one that minimizes ||H * f - g_measured||, where ||.|| denotes a suitable distance norm.
- according to typical embodiments, ||.|| is computed as the L2 norm.
- This minimization can also be implemented as an iterative process. The involved minimization process can be carried out for example by algebraic reconstruction techniques, maximum likelihood expectation value maximization, pseudo inversion by means of singular value decomposition, Gauss-Seidel inversion, successive over-relaxation, Jacobi inversion, multiplicative algebraic reconstruction techniques, simultaneous iterative reconstruction techniques or by other techniques. Also, regularization methods such as Tikhonov regularization, total variation regularization and other regularizations can be used. In light of this, the image generation rule is defined primarily by the matrix H. However, the algorithm used for solving the minimization problem as well as the starting vector used in an iterative solution are also part of the image generation rule.
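- As an illustration of one of the listed techniques, the following sketch implements a plain maximum likelihood expectation maximization (MLEM) update for the linear model g_predicted = H * f. It is a generic textbook version, not the specific implementation of the described apparatus, and all parameter names are chosen freely.

```python
import numpy as np

def mlem(H, g_measured, n_iter=50, f0=None, eps=1e-12):
    """Maximum likelihood expectation maximization for the model g = H * f.

    H          : (K, I) system matrix, K measurements, I image elements (voxels)
    g_measured : (K,) measured detector data
    f0         : optional non-negative start image (warm start)
    Returns the reconstructed image vector f of length I.
    """
    f = np.ones(H.shape[1]) if f0 is None else np.asarray(f0, dtype=float).copy()
    sensitivity = H.sum(axis=0) + eps        # per-voxel normalization (column sums of H)
    for _ in range(n_iter):
        g_predicted = H @ f + eps            # forward projection
        ratio = g_measured / g_predicted     # measured-to-predicted ratio
        f = f * (H.T @ ratio) / sensitivity  # multiplicative MLEM update
    return f
```

The same inputs H and g_measured also define the minimization of ||H * f - g_measured||, so least-squares or regularized variants of the reconstruction operate on identical data.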
- the image generation rule is non-linear. Also for such non-linear image generation rules, analogous methods can be applied.
- image generation rules, in particular the matrix H described above, can be generated or enhanced on the basis of at least one detection model.
- Detection models can be changed or adapted, in particular on the basis of new detector data.
- detection models can be enhanced or be continuously enhanced. Enhanced or continuously enhanced detection models can be used for enhancing an image generation rule.
- the entries of the imaging matrix can be calculated with the help of detection models.
- detection models can be generated by algebraic, analytic, numeric, or statistical methods, or on the basis of measurement data or by combinations thereof.
- detection models are generated by measurements on a radioactive point source which is positioned differently and the radiation of which is measured from different positions and orientations. By such measurements or by suitable detection models, information is gained about for example at least one material property of at least one material, or such information is used.
- material properties of materials distributed in space can be determined, such as those of the operating table and instruments, but also of the patient himself.
- Material properties include the attenuation between source and detector, the scattering between source and detector, the material properties of materials between source and detector, the attenuation by a detector shield or the scattering by a detector shield, the attenuation in the detector itself and the scattering in the detector itself.
- analytic, algebraic, numerical, or statistical models, or models that are combinations thereof, can also take into account constraints besides material properties. Examples of constraints are the relative solid angle between a detector and a source area of radiation, the dimensions of the detector or the absence of material or matter. Constraints allow certain image vectors f to be excluded from the start, thereby yielding better results for the optimization problems described above.
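- A deliberately reduced detection model might compute each entry H_ki from the solid angle subtended by the detector at voxel i for measurement k, attenuated along the straight line between voxel and detector. The sketch below only illustrates where a material property (attenuation) and a constraint (voxels behind the detector face contribute nothing) enter such a model; the actual detection model can be far more detailed.

```python
import numpy as np

def system_matrix_entry(voxel_pos, det_pos, det_normal, det_area, mu):
    """Simplified entry H_ki of the imaging matrix for one voxel and one measurement.

    voxel_pos  : 3-vector, center of voxel i
    det_pos    : 3-vector, detector position for measurement k
    det_normal : unit vector, detector viewing direction
    det_area   : sensitive detector area
    mu         : effective linear attenuation coefficient of the material
                 between source and detector (material property)
    """
    d = det_pos - voxel_pos
    r = np.linalg.norm(d)
    cos_incidence = np.dot(d / r, det_normal)
    if cos_incidence <= 0.0:
        return 0.0                              # constraint: voxel lies behind the detector face
    # Fraction of isotropically emitted radiation that hits the detector area
    geometry = det_area * cos_incidence / (4.0 * np.pi * r ** 2)
    attenuation = np.exp(-mu * r)               # attenuation between source and detector
    return geometry * attenuation
```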
- FIG. 10 schematically shows the mapping of real objects and of a real detection process onto a detection model and a simulated detection process.
- real objects such as a detector 110 , a body 30 and a source of radiation 10 within the body are mapped to data of a detection model.
- data with respect to a detector describe a virtual detector 110 a
- data with respect to the body describe a virtual body 30 a
- data with respect to the radiation source describe a virtual radiation source 10 a.
- FIG. 11 illustrates the determination of a detection model on the basis of measurements.
- a radioactive point source 50 emits nuclear radiation 52 in all spatial directions.
- a detector 110 measures the radiation source 50 at different positions and with different orientations (a second position/orientation is depicted with dashed lines), whereby information about material properties is gained. Material properties can for example include those of the body 30.
- a detection model can be determined. The detection model can take into account the information of the measurement data and further information, such as for example the detector geometry.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a detector of the image generating apparatus.
- the radiation may be nuclear radiation. Detecting can take place during a detection period.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- the detector data include information about the detected radiation.
- the detector data include information about the position and/or orientation of the detector.
- the method further includes determining an image generation rule by means of the evaluation system for image generation on the basis of the collected detector data, taking into account a detection model.
- the detection model takes into account a material property of a material influencing the detection and/or a constraint.
- the detector is movable. According to further embodiments, the detector is freely movable. In further embodiments, the detector is handheld. In typical embodiments, the method includes again, repeatedly, or continuously collecting detector data for image generation by means of the evaluation system of the image generating apparatus, typically during a detection period.
- the method further includes determining at least one quality value from the collected detector data by means of the evaluation system. In further embodiments, the method includes again or repeatedly determining at least one quality value from the collected detector data by means of the evaluation system. Typically, determining, again determining, repeatedly determining, or continuously determining takes place during a detection period.
- the detection model according to embodiments of the invention can be generated algebraically, analytically, numerically, statistically, or on the basis of measurement data, or by combinations thereof.
- the detection model takes into account at least one further material property and/or at least one further constraint.
- Material properties can for example influence the detection model because of the following effects: attenuation of radiation, scattering of radiation, refraction of radiation, diffraction of radiation, influence of electromagnetic fields, influence of background radiation, signal noise, or influence of errors in the measurement values of the detector as well as in measurements of position and/or orientation of the detector.
- Embodiments of the invention can include detection models that take into account these and other effects.
- Methods for image generation according to embodiments of the invention can also take into account at least one constraint, wherein the constraints can for example be the relative solid angle between the detector and the source region of radiation, the dimensions of the detector or the absence of material.
- an image generating apparatus for image generation includes a detector for detection of radiation.
- the detector can be a movable detector.
- the detector can be a freely movable detector.
- the detector can be a handheld detector.
- the radiation can be nuclear radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- detector data include data with information about the detected radiation.
- the detector data also include data with information about the position and/or orientation of the detector for image generation.
- the evaluation system further includes a data memory portion for storing detector data.
- the evaluation system further includes a program memory portion with a program for determining an image generation rule for image generation on the basis of the collected detector data, taking into account a detection model.
- the detection model takes into account at least one material property of at least one material influencing the detection and/or at least one constraint.
- the interface system is an interface system for transmitting detector data to the evaluation system.
- the detector data typically include information about the detected radiation.
- the data also include information about the position and/or orientation of the detector.
- the interface system is an interface system for continuously transmitting detector data to the evaluation system for image generation.
- the detector data can again include information about the detected radiation and/or information about the position and/or orientation of the detector.
- the transmission is a transmission during the detection period.
- the detection model takes into account an attenuation of radiation, a scattering of radiation, a refraction of radiation, a diffraction of radiation, the influence of electromagnetic fields, the influence of background noise, a signal noise, the influence of errors in the measurement data of the detector and in the measurement of position and/or orientation of the detector or further effects.
- the detection model takes into account constraints such as the relative solid angle between the detector and a source region of radiation, the dimension of the detector or the absence of a material or combination of these constraints.
- image generation rules are modified.
- the entries of the imaging matrix or system matrix are modified.
- the system matrix is modified as soon as further measurement data are available.
- the norm of the difference between H applied to f and g_measured can be minimized again as soon as further measurement data are available. Consequently, embodiments typically include a continuous modification of the image generation rule. Also, detection models can continuously be adapted and enhanced.
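- Continuous modification of the image generation rule can be pictured as appending the newly available measurements as additional rows of H and additional entries of g_measured and re-running the reconstruction, optionally warm-starting from the previous image. The snippet below is only a sketch of that bookkeeping; it reuses the mlem() illustration from further above (itself only an example) and assumes the new rows have already been computed from the detection model.

```python
import numpy as np

class IncrementalReconstruction:
    """Keeps H and g_measured up to date and re-solves whenever new data arrive."""

    def __init__(self, n_voxels):
        self.H = np.empty((0, n_voxels))
        self.g = np.empty(0)
        self.f = np.ones(n_voxels)

    def add_measurements(self, H_new_rows, g_new):
        # Append the rows modelling the new detector positions/orientations
        # and the corresponding measured values.
        self.H = np.vstack([self.H, H_new_rows])
        self.g = np.concatenate([self.g, g_new])

    def update_image(self, n_iter=10):
        # Re-run the reconstruction; warm-starting from the previous image
        # keeps the continuous modification cheap.
        self.f = mlem(self.H, self.g, n_iter=n_iter, f0=self.f)
        return self.f
```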
- detector data are registered with compatible data.
- such compatible data are gained by an imaging rule from a given image.
- Such an image can for example be an anatomical or body-functional image that was taken beforehand (pre-operatively taken).
- an imaging matrix H can depend on a location vector T, in which information about the relative location and/or orientation between the detector and the source of radiation is included.
- T can describe a relative location in the sense of a rigid registration or in the sense of a deformable registration.
- the information contained in g = H(T) * f, with f being the image vector of the given image, represents predicted or virtual or simulated detector data, which are called simulation detector data.
- the vector g_measured contains information about the detected radiation.
- the format (i.e. the structure of the vector g) of the simulation detector data is compatible with the measured detector data g_measured.
- a registration of detector data with such compatible data takes place, according to some embodiments of the invention, in that the distance ||H(T) * f - g_measured|| is minimized, wherein the distance norm can for example be given by the L2 norm.
- the minimization takes place over all location vectors T to obtain, as a result of the minimization, an optimal location vector T. By using this optimal location vector T, an image vector is associated to the measured detector data by the matrix H(T), the image vector being compatible with the image vector of the given image and being registered.
- the minimization is carried out by algorithms such as the best-neighbour ansatz, a simplex optimizer, the Levenberg-Marquardt algorithm, the steepest gradient descent, the conjugate gradient descent, or others.
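- A deliberately simple illustration of this registration is a coarse search over candidate location vectors T: for each candidate the system matrix H(T) is evaluated, simulation detector data are predicted from the given image vector, and the L2 distance to g_measured is computed. Practical implementations would rather use one of the optimizers listed above; the callable build_H is a hypothetical stand-in for the detection model.

```python
import numpy as np

def register_by_search(candidate_Ts, build_H, f_given, g_measured):
    """Pick the location vector T whose simulation detector data best match g_measured.

    candidate_Ts : iterable of candidate location vectors T
    build_H      : callable T -> system matrix H(T) (hypothetical detection model)
    f_given      : image vector of the given (e.g. pre-operatively taken) image
    g_measured   : measured detector data
    """
    best_T, best_cost = None, np.inf
    for T in candidate_Ts:
        g_sim = build_H(T) @ f_given               # simulation detector data for this T
        cost = np.linalg.norm(g_sim - g_measured)  # L2 distance
        if cost < best_cost:
            best_T, best_cost = T, cost
    return best_T, best_cost
```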
- the registration not only can take place by comparing the detector data g as described above, but also by direct comparison of the image data f gained from the detector data with a given image. This comparison can be carried out by an image comparison with the methods described above with respect to g, or else by a comparison of single marking points designated for this purpose. Also, other registration methods are possible.
- the image comparison described above further allows obtaining an estimation of the quality of the collected data (as deviation between the image data gained from the detector data and the given image).
- Data can also be indirectly registered with compatible data.
- Indirect registration is to be understood as a registration of a first data set with a third data set by means of a second data set.
- the first data set is registered with a second data set, for example as described above.
- the second data set is registered with a third data set.
- the first data set is finally registered with a third data set.
- the first data set can have been gained from a base image such as an anatomical image taken pre-operatively.
- the second data set can for example correspond to detector data of a first instance in time, and a third data set to detector data of a later instance in time. If the registration between the first data set, derived from the base image, and the second data set has been successful, the similarity between the second and third data set, consisting of detector data, simplifies a registration if indirect registration is used as described above.
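- When each pairwise registration yields a rigid transform, for example as a 4x4 homogeneous matrix, the indirect registration of the first data set with the third one is simply the composition of the two intermediate transforms. This is a minimal sketch under that rigid-transform assumption.

```python
import numpy as np

def compose(T_bc, T_ab):
    """Chain two registrations given as 4x4 homogeneous matrices.

    T_ab maps data set A into the frame of data set B,
    T_bc maps data set B into the frame of data set C.
    The result maps A directly into C (indirect registration).
    """
    return T_bc @ T_ab

# Example: first data set -> second data set (detector data at a first instance in time)
# -> third data set (detector data at a later instance in time); identities as placeholders.
T_first_to_second = np.eye(4)
T_second_to_third = np.eye(4)
T_first_to_third = compose(T_second_to_third, T_first_to_second)
```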
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a detector of the image generating apparatus. Detecting can take place during a detection period.
- the radiation can be nuclear radiation.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the method further includes collecting detector data for image generation by means of the evaluation system of the image generating apparatus. Typically, the detector data include information about the detected radiation. Typically, the detector data also include information about the position and/or orientation of the detector.
- the method further includes registering the detector data with compatible data by means of the evaluation system.
- the compatible data are detector data.
- the method for image generation includes generating simulation detector data based on a base image by means of the evaluation system.
- the compatible data can be simulation detector data.
- at least one comparison function is used for registering the detector data.
- the method includes an indirect registration of simulation detector data with detector data by means of second compatible data.
- the second compatible data are detector data.
- the second compatible data are second simulation detector data based on a second base image.
- Comparison functions can for example be cross correlation, trans-information, block entropies, cross correlation rates, cosine measure, extended Jaccard similarity, ratio image uniformity, sums of squared distances or sums of absolute values of distances, or further comparison functions.
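- For illustration, one of the listed comparison functions, the cross correlation (in its normalized form), and the sum of squared distances can each be written in a few lines; the same skeleton applies to other similarity measures by swapping the formula. This is a generic sketch, not the specific implementation of the described method.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Normalized cross correlation of two equally shaped data sets (e.g. images).

    Returns a value in [-1, 1]; 1 means perfect linear agreement.
    """
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

def sum_of_squared_distances(a, b):
    """Another listed comparison function: sum of squared distances (a dissimilarity measure)."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return float(np.sum(diff ** 2))
```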
- the base image is an anatomical or body-functional image.
- the second base image is an anatomical or body-functional image.
- Anatomical images can for example be a computed tomography, a magnetic resonance tomography, an ultrasonic image, an optical image, or an x-ray image.
- Body-functional images can for example be a positron emission tomography, short PET, a single photon emission computed tomography, short SPECT, or an optical tomography.
- an image generating apparatus for image generation includes a detector for detecting radiation.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the radiation can be nuclear radiation.
- the detector can be a detector for detecting during a detection period.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- the detector data include information about the detected radiation.
- the detector data also include information about the position and/or orientation of the detector.
- the interface system can be an interface system for continuously transmitting detector data to the evaluation system.
- the evaluation system further includes a program memory portion with a program for registering detector data with compatible data.
- the compatible data are detector data.
- the evaluation system further includes a program memory portion with a program for generating simulation detector data based on a base image.
- the compatible data are simulation detector data.
- the program for registering is programmed to register detector data with compatible data by means of at least one comparison function.
- the evaluation system further includes a program memory portion with a program for indirectly registering the simulation detector data with detector data by means of second compatible data.
- the second compatible data can be detector data.
- the second compatible data can be second simulation detector data based on a second base image.
- the comparison function can, for example, be one of the comparison functions described above or another comparison function.
- the base image or the second base image can have the same or similar properties as the ones described above.
- Embodiments of the invention also include registering images. These images can for example be generated from detector data or from other data sets.
- a registration of images can, for example, take place by maximizing the similarity or minimizing the dissimilarity of both images.
- comparison functions can be used such as cross correlations, trans-information, block entropies, cross correlation rates, cosine measure, extended Jaccard similarity, ratio image uniformity, sums of squared distances or sums of absolute values of distances. Other information theoretic comparison functions may also be used.
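- As an illustrative sketch only (not part of the original disclosure), three of the comparison functions named above can be computed for two equally shaped image arrays roughly as follows; the function names and the histogram bin count are assumptions:

```python
import numpy as np

def cross_correlation(a, b):
    """Normalized cross correlation of two equally shaped images (similarity)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def sum_of_squared_distances(a, b):
    """Sum of squared distances (dissimilarity: smaller means more similar)."""
    return float(np.sum((a - b) ** 2))

def trans_information(a, b, bins=32):
    """Trans-information (mutual information) estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

A registration can then vary the transformation parameters of one data set and maximize the cross correlation or trans-information, or minimize the sum of squared distances.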
- for the maximization or minimization, optimization algorithms can be used, for example the ones mentioned above or others.
- Images can also be registered point-wise. To this end, specifically chosen points in both images are set into relation. The selection can take place automatically or interactively. Algorithms for point-wise registration can for example be the Umeyama or the Walker algorithm.
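- As a hedged illustration (not part of the original disclosure), the point-wise registration named above can be sketched with the Umeyama method for a rigid transform between already selected corresponding point sets; the names and the restriction to rotation plus translation are assumptions:

```python
import numpy as np

def umeyama_rigid(src, dst):
    """Estimate rotation R and translation t such that dst[i] ≈ R @ src[i] + t.

    src, dst: (N, 3) arrays of corresponding points selected in both images.
    """
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    cov = (dst - mu_dst).T @ (src - mu_src) / src.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        S[2, 2] = -1.0          # enforce a proper rotation (det(R) = +1)
    R = U @ S @ Vt
    t = mu_dst - R @ mu_src
    return R, t
```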
- the process includes registering a third image with a second image, registering a first image with a third image, and registering the first image with the second image using the registration of the first image with the third image.
- the images can be for example anatomical or body-functional images as in the case of the registration of data sets. Such images can be gained from detector data.
- the image can also be gained from other detectors of the detection system, such as for example by means of computer tomography, magnetic resonance tomography, ultrasonic sonography, by picture taking with an optical camera, or with an x-ray device. Examples of organ-functional images are images gained from the detector data, but also positron emission tomography, short PET, single photon emission computer tomography, short SPECT, or optical tomography.
- a method for image generation includes generating a first image on the basis of collected detector data by means of the evaluation system.
- the method further includes a registration of the first image with a second image.
- a minimization of the dissimilarity or a maximization of the similarity can be used.
- a comparison function is used for the minimization or maximization. Comparison functions can be the ones listed above or other comparison functions.
- a method for image generation includes registering a third image with a second image, registering a first image with the third image, and registering the first image with the second image by means of the registration of the first image with the third image.
- the second image is an anatomical image. In other embodiments, the second image is a body-functional image.
- An anatomical image can be one of the anatomical images described above or be a different anatomical image.
- a body-functional image can be a body-functional image as described above or be a different body-functional image.
- embodiments of the invention provide methods and devices for quality control of the detector data as well as of the generated images.
- quality control takes place continuously. In this way, the quality and validity of a generated image is checked. In further embodiments, the quality control takes place already during the detection period.
- FIG. 12 shows a typical process of quality control according to embodiments of the invention.
- a time axis 620 is shown, symbolizing the course of time (from left to right).
- a detection period 622 is further shown.
- a quality determination period 624 is depicted.
- the quality determination period 624 starts after the start of the detection period when detector data are already available.
- the quality determination period 624 can end before the detection period, at the same time as the detection period or after the detection period.
- the quality determination period 624 ends after the detection period.
- the distances between marks on the line symbolizing the quality determination period 624 themselves represent intervals in which a quality determination process takes place, such as, for example, the determination of a quality value by the evaluation unit.
- a warning signal 629 is output if data gathered or collected by the evaluation unit do not pass quality control.
- a warning signal can be output for example acoustically, optically, haptically or by combinations thereof.
- Such a warning signal can make a user, for example a surgeon, aware that the images determined from the detector data may not be reliable, at least at the instance in time of the output of the warning signal.
- Quality control is typically carried out on the basis of at least one quality criterion.
- a quality value is calculated.
- several quality values can be calculated for one or more quality criteria, for example if a quality value is determined that depends on a respective imaging region. Therein, for example, the validity or quality of an image can be rejected if such a quality value does not fulfil, i.e. does not satisfy, one or more quality criteria.
- an image can be regarded as valid if a quality value satisfies a quality criterion or satisfies several quality criteria.
- an image can also be understood as a specified imaging region associated with the respective quality value.
- Examples of quality criteria are the following: the similarity between a first and a second image, wherein one of the images or both of the images may be generated from detector data; the conditioning of an image generation rule for generating an image; the relevance of data, such as detector data, for an image element; the plausibility of the image generation from data, such as detector data or data from a second image; the uniformity of data, such as detector data; or the risk of false generation because of faulty data, such as detector data.
- the similarity between a first and a second image can be determined similarly as in the case of registration.
- already registered images can again be compared with each other for similarity.
- the images can therein have been registered by direct image registration or by data registration.
- the images can for example be anatomical or organ-functional images as the ones described above, or others.
- the conditioning of the image generation rule can be given by the conditioning of the imaging matrix or the system matrix.
- the conditioning number of the imaging matrix H (see above) can be calculated.
- a conditioning number can be calculated by analysis of the spectrum of the singular values of the matrix or by similar matrix decomposition measures (for example relation of largest to smallest eigenvalue or number of eigenvalues being above or below a threshold value).
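- As a minimal sketch (an illustrative assumption, not the original implementation), such conditioning numbers can be obtained from the singular value spectrum of the imaging matrix H, for example as follows:

```python
import numpy as np

def conditioning_quality_values(H, eigen_threshold=1e-6):
    """Quality values derived from the singular value spectrum of H."""
    s = np.linalg.svd(H, compute_uv=False)
    condition_number = s.max() / max(s.min(), 1e-15)   # largest to smallest singular value
    n_significant = int(np.sum(s > eigen_threshold))   # number of values above a threshold
    return condition_number, n_significant
```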
- the quality criterion is a threshold value for the conditioning number. If the calculated conditioning number, i.e. the quality value, is smaller (respectively larger, depending on the definition of the conditioning number) than the threshold value, the data, such as detector data, do not fulfil the quality criterion, and an image generated therefrom is therefore rejected. If, on the other hand, the calculated conditioning number is larger (respectively smaller) than the threshold value, the quality of the data, such as detector data, and of an image reconstructed therefrom is accepted.
- the quantity named with the technical term sparsity of a matrix row or of a matrix column can be a quality value, and a threshold value with respect to this quantity can be used as a quality criterion.
- a row or a column of a matrix is sparse if fewer than a number of entries determined by the threshold value are different from zero (respectively from numerical zero, i.e., smaller than a given epsilon-threshold). If a matrix column is too sparse, an image element depends on too few measurements, and therefore a high risk of false generation exists for this image element. If a matrix row is too sparse, the measurement value associated with this row is responsible for too few image elements, which again may lead to a high risk of false generation.
- the relevance of data for an image element can be used as a quality criterion.
- the relevance of a row or column can for example be associated with a threshold value for the sum of all entries of the row or column.
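- The sparsity and relevance checks described above can be sketched as follows (the thresholds and names are assumptions chosen for illustration):

```python
import numpy as np

def too_sparse(H, eps=1e-12, min_nonzero=3):
    """Boolean masks of rows/columns of H with too few non-zero entries."""
    nonzero = np.abs(H) > eps                       # numerical zero via epsilon threshold
    rows = nonzero.sum(axis=1) < min_nonzero        # measurement tied to too few image elements
    cols = nonzero.sum(axis=0) < min_nonzero        # image element seen by too few measurements
    return rows, cols

def not_relevant(H, min_weight=1e-3):
    """Boolean masks of rows/columns whose entry sums fall below a relevance threshold."""
    return H.sum(axis=1) < min_weight, H.sum(axis=0) < min_weight
```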
- the plausibility of an image generation for example takes into account a constraint.
- examples of constraints are the maximal amount of radiation, the gradient of the sum of maximal radiation, minimal radiation, radiation associated with image elements that obviously cannot contain radiation sources (for example regions filled with air), and others.
- a corresponding quality value can be associated with such a constraint.
- the uniformity of detector data is determined by the spatial distribution of measurements. Uniform measurements are present if the measurements are distributed uniformly around the region to be reconstructed. A measure for uniformity is formed by the deviation of the actual measurements from a completely uniform measurement. A quality criterion is formed by a threshold value with respect to this uniformity.
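- As a hedged sketch, a simple uniformity value can be formed by binning the viewing directions of the measurements around the region to be reconstructed and measuring the deviation of the bin counts from a perfectly uniform coverage (the bin count and normalization are assumptions):

```python
import numpy as np

def uniformity_value(view_angles_rad, n_bins=36):
    """Deviation of the measurement directions from uniform angular coverage.

    Returns 0 for perfectly uniform coverage; larger values mean less uniform data.
    """
    counts, _ = np.histogram(view_angles_rad % (2 * np.pi),
                             bins=n_bins, range=(0.0, 2 * np.pi))
    expected = counts.sum() / n_bins
    return float(np.sqrt(np.mean((counts - expected) ** 2)) / (expected + 1e-12))
```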
- a quality control based on the quality criteria named above, or on others, is carried out successively, preferably quasi-continuously (as shown in FIG. 12 ).
- the results of quality control are output to a user by the output system.
- the output can be visual, acoustical or haptic.
- the output can take place by a coarsening of the image resolution in the corresponding image region. Thereby, a user is prevented from putting false confidence into possibly faulty images.
- a method for image generation by means of an image generating apparatus is provided. The method includes detecting radiation by means of a detector. Detecting can take place during a detection period.
- the radiation can be nuclear radiation.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- the detector data include information about the detected radiation.
- the detector data comprise information about the position and/or orientation of the detector.
- the method further includes determining at least one quality value from the collected detector data by means of the evaluation system. In typical embodiments the determination is a repeated determination or a continuous determination, typically during the detection period.
- the method includes again, repeatedly, or continuously collecting detector data for image generation by means of the evaluation system of the image generating apparatus, preferably during the detection period.
- the at least one quality value is determined with respect to at least one quality criterion.
- a quality criterion can for example be the similarity between a first image generated from the collected detector data and a second image, the conditioning of an image generation rule for image generation from the collected detector data, the relevance of the collected detector data for an image element, the plausibility of image generation from the collected detector data, the uniformity of the collected detector data, or the risk of false generation because of faulty detector data.
- further quality criteria may be used.
- the method includes outputting the at least one determined quality value to a user. Further, other embodiments include outputting a warning to a user if the at least one quality value does not fulfil at least one quality criterion. Outputting the quality value or the warning can take place visually, acoustically, haptically, or by combinations thereof.
- an image generating apparatus for image generation includes a detector for a detection of radiation.
- the detector can be a detector for detecting radiation during a detection period.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the radiation can be nuclear radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system.
- detector data include information about the detected nuclear radiation.
- the detector data further include information about the position and/or orientation of the detector.
- the evaluation system further includes a data memory portion for storing the detector data.
- the evaluation system further includes a program memory portion with a program for determining at least one quality value with respect to image generation from the detector data.
- the program can also be a program for again, repeatedly, or continuously determining at least one quality value with respect to image generation from the detector data. Therein, determining at least one quality value can take place during a detection period.
- the interface system is an interface system for again, repeatedly, or continuously transmitting detector data to the evaluation system.
- the transmission can take place during the detection period.
- the detector data can include information about the detected radiation.
- the detector data can also include information about the position and/or orientation of the detector.
- the program for determining at least one quality value is a program for determining, again determining, repeatedly determining, or continuously determining a quality value with respect to at least one quality criterion.
- the image generating apparatus for image generation further includes an output system, which includes at least one output unit.
- the output unit is an output unit for outputting the at least one determined quality value to a user.
- the output unit or a further output unit is an output unit for outputting a warning to a user if the at least one quality value does not fulfil at least one quality criterion.
- the one output unit or the other output unit can be output units for instructions or warnings to the user in visual, acoustic, or haptic form, or in combinations thereof.
- the outputs can be combined with an instruction to a user for improving the quality value, as described further below.
- a quality is continuously enhanced.
- the quality can already be enhanced, or continuously enhanced, during the detection period.
- the image generation takes place on the basis of a linear image generation rule.
- This image generation can for example take place by applying an imaging matrix or system matrix H to a vector f, wherein H and f have the meanings explained above.
- the image generation, also called reconstruction, takes place by minimization of the distance between the measured vector g and the vector H·f obtained as a function of f, as described above.
- An improvement of image generation can take place in different ways, which include: improving the starting value of the vector f in the minimization problem, and enhancing the image generation rule, in particular the imaging matrix H.
- a starting vector f_start can for example be used, the information content of which is derived from a given image, for example from a pre-operative anatomical or organ-functional image. This helps to avoid arriving at a wrong solution when solving the minimization problem (such as being trapped in a local minimum that does not correspond to the desired solution). Also, the computing time can be decreased, because one already starts with a nearly correct solution. In this way, a good solution of the minimization problem, i.e. a good image f, can be obtained with reduced effort.
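- A minimal sketch of such a reconstruction with an informed starting value is the following; a plain Landweber (gradient descent) iteration minimizes the distance between the measured vector g and H·f, starting from a vector f_start derived from a prior image (the step size, iteration count, and non-negativity clipping are assumptions):

```python
import numpy as np

def reconstruct_from_start(H, g, f_start, n_iter=200):
    """Minimize ||H f - g||^2 by Landweber iteration, starting from f_start."""
    step = 1.0 / (np.linalg.norm(H, 2) ** 2)   # convergent step size (below 2 / sigma_max^2)
    f = f_start.astype(float).copy()
    for _ in range(n_iter):
        f -= step * (H.T @ (H @ f - g))        # gradient step on the squared distance
        np.clip(f, 0.0, None, out=f)           # the radiation distribution cannot be negative
    return f
```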
- An improvement of the imaging rule, respectively of the imaging matrix H can in particular take place by calculating at least one quality value, wherein the quality value is the same as in the quality control of data described above, or can be a further quality value.
- the imaging matrix H is modified while taking into account a quality value.
- the imaging matrix H is modified in such a way that the modified matrix H better satisfies one or several quality criteria.
- rows or columns can be eliminated which have been recognized as being too sparsely filled according to a threshold with respect to the sparsity of a matrix.
- rows or columns of the imaging matrix H can be eliminated which do not satisfy the criterion of relevance.
- such rows or columns can be combined, whereby also the corresponding image elements (entries of f), respectively detector measurement values (entries of g), are correspondingly combined.
- the uniformity can further be improved, for example by combining the detector data of neighbouring measurements such that rather uniformly distributed effective measurements are obtained.
- the imaging matrix becomes smaller, and also for this reason the reconstruction is numerically better solvable.
- a higher weight can be attributed to the entries averaged from several values, which takes into account their higher statistical significance. For example, the contribution of such entries to the distance norm can be weighted correspondingly higher in the minimization.
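- A hedged sketch of such an enhancement of the imaging matrix: measurement rows that are too sparse or irrelevant are dropped together with the corresponding entries of g, and pairs of neighbouring measurements are combined into effective measurements with a higher weight (all thresholds, names, and the pairing strategy are assumptions):

```python
import numpy as np

def prune_measurements(H, g, eps=1e-12, min_nonzero=3, min_weight=1e-3):
    """Drop measurement rows of H (and entries of g) that are too sparse or irrelevant."""
    nonzero = np.abs(H) > eps
    keep = (nonzero.sum(axis=1) >= min_nonzero) & (H.sum(axis=1) >= min_weight)
    return H[keep], g[keep]

def combine_measurement_pairs(H, g, weights, pairs):
    """Average pairs of neighbouring measurements; combined rows get the summed weight."""
    H_rows, g_vals, w_vals, used = [], [], [], set()
    for i, j in pairs:
        H_rows.append(0.5 * (H[i] + H[j]))
        g_vals.append(0.5 * (g[i] + g[j]))
        w_vals.append(weights[i] + weights[j])   # higher statistical significance
        used.update((i, j))
    for k in range(H.shape[0]):
        if k not in used:
            H_rows.append(H[k])
            g_vals.append(g[k])
            w_vals.append(weights[k])
    return np.array(H_rows), np.array(g_vals), np.array(w_vals)
```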
- constraints can also be set on the basis of surface information; this surface can for example be the surface of the body of a patient.
- This surface can be scanned by a laser range scanner, a laser surface pattern scanner, a laser pointer surface scanner, stereoscopic camera systems, time-of-flight cameras and further surface capturing systems.
- Such surface information can also be determined on the basis of the geometry of an object tracked by the tracking system and its tracked trajectory: if an object cannot penetrate into the patient's tissue, then the spatial areas traversed by this object must be filled with air and can hence not contain any radiation sources.
- this object can be formed by the detector itself or be integral with the detector.
- constraints can be set on the basis of the knowledge of the anatomy and can be taken into account.
- a constraint can for example be that body parts such as bones or the trachea (which, for example with a certain tracer, cannot form nuclear radiation sources) cannot show any radiation activity. In this way, possible mappings can be eliminated that would falsely ascribe a radiation activity to such regions.
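- A minimal sketch of such a constraint, assuming a boolean mask marking image elements known to be air (for example from a scanned surface, the trajectory of a tracked object, or anatomical knowledge): the corresponding columns of H are removed so that no radiation activity can be ascribed to these elements:

```python
import numpy as np

def apply_air_constraint(H, f, air_mask):
    """Exclude image elements that cannot contain radiation sources.

    air_mask: boolean array over image elements, True where the element is air.
    """
    keep = ~air_mask
    return H[:, keep], f[keep]   # drop columns of H and the corresponding entries of f
```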
- Anatomical information can for example be obtained from anatomical images captured beforehand. These can be registered with current data. Also, standard data can be used, for example from anatomical atlases, which can also be registered with currently generated images. Anatomical information can also be obtained during the procedure by further detectors of the detection system, such as for example ultrasonic devices, computer tomographs, x-ray devices, optical cameras, magnetic resonance tomography devices, and others.
- the detection system can include further radiation detectors. Further detector data can also be used for image generation. Further detectors can be radiation detectors, in particular radiation detectors for nuclear radiation. The further detectors can be movable radiation detectors. The further radiation detectors can also be fixed radiation detectors. For instance, the table on which the body containing the radiation distribution lies may include a gamma camera. In further embodiments, floor-, ceiling-, and/or wall-mounted detectors are used.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of the detector of the image generating apparatus. Detecting can take place during a detection period.
- the radiation can be nuclear radiation.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus. Typically, the detector data include information about the detected radiation. Typically, detector data also include information about the position and/or orientation of the detector.
- the method further includes determining an image generation rule by means of the evaluation system on the basis of the collected detector data.
- the method further includes modifying the image generation rule on the basis of at least one quality value. In typical embodiments, the modification is a repeated or continuous modification of the image generation rule. Typically, modifying takes place during the detection period.
- the collection of detector data is a renewed, repeated, or continuous collection of detector data.
- the determination of an image generation rule is a renewed, repeated, or continuous determination of an image generation rule. Typically, determining, again determining, repeatedly determining, or continuously determining takes place during a detection period.
- the at least one quality value is determined with respect to at least one quality criterion.
- Quality criteria can be the same as the ones described in the section about the quality control, or can be further quality criteria.
- Further quality criteria can be criteria on the basis of constraints. Such constraints can consist of using surface information, anatomical information, or other information. Also, further radiation detectors can be used, and thus further detector data can be used for modifying, again modifying, repeatedly modifying, or continuously modifying the image generation rule.
- an image generating apparatus for image generation includes a detector for detecting radiation.
- the detector can be a detector for detecting during a detection period.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the radiation can be nuclear radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data to the evaluation system for image generation. Detector data typically also include information about the position and/or orientation of the detector.
- the evaluation system further includes a data memory portion for storing detector data.
- the evaluation system further includes a program memory portion with a program for determining an image generation rule on the basis of the collected detector data.
- the evaluation system further includes a program memory portion with a program for modifying the image generation rule on the basis of at least one quality value.
- the modification is typically a renewed, repeated, or continuous modification of the image generation rule.
- the program for modifying, again modifying, repeatedly modifying or continuously modifying the image generation rule is, according to typical embodiments, a program for modifying, again modifying, repeatedly modifying, or continuously modifying the image generation rule on the basis of at least one quality value during the detection period.
- the interface system is an interface system for again, repeatedly, or continuously transmitting detector data to the evaluation system.
- the transmission is a transmission during a detection period.
- the program for determining at least one quality value is a program for determining at least one quality value with respect to at least one quality criterion.
- Quality criteria can be the ones described above in the section “quality control”, or can be other quality criteria.
- the image generating apparatus further includes an output system with at least one output unit.
- the output unit is an output unit for outputting the at least one determined quality value to a user.
- the output unit is an output unit for outputting a warning to a user if the at least one quality value does not satisfy at least one quality criterion. Outputting a quality value or a warning to a user can take place in visual, acoustical, or haptic form, or in a combination form thereof.
- Embodiments of the present invention include outputting an instruction to a user.
- a user can be a human user.
- a user can also be another living being.
- a user can also be an inanimate object, for example a machine.
- typical embodiments include outputting of an instruction to a user for further moving the detector in dependence on the detector data already collected.
- Typical embodiments include a continuous instruction for detection on the basis of a continuous quality control, which has been described above.
- the output takes place by means of the output system, in particular in optical, acoustical or haptic form, or by combinations thereof. Specifically, instructions for further movement of the detector are given in such a way that, when followed, a quality of the collected detector data is improved.
- an instruction for further moving the detector in dependence on the collected data is output such that, if followed, the quality of the detector data is presumably enhanced the most. Instructions can for example take place in the form of outputting an arrow pointing in the direction in which further measurements shall be made.
- the calculation of the current quality or rating or validity of the collected detector data precedes the outputting of an instruction, and also a calculation of how the quality of the data would change if further detector data were available, in particular detector data with information about the detected radiation measured from different orientations or positions of the detector.
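- A hedged sketch of such a prediction: a set of candidate detector poses is scored by how the conditioning of the imaging matrix would change if the row predicted for that pose were appended, and the most beneficial pose is selected for the instruction (the candidate set and the forward row model are assumptions):

```python
import numpy as np

def predicted_condition_number(H, predicted_row):
    """Condition number of H extended by one predicted measurement row."""
    s = np.linalg.svd(np.vstack([H, predicted_row]), compute_uv=False)
    return s.max() / max(s.min(), 1e-15)

def best_next_pose(H, candidate_poses, row_model):
    """Select the candidate pose whose predicted measurement improves conditioning most.

    row_model(pose) is an assumed forward model returning the imaging-matrix row
    the detector would contribute if it adopted that pose.
    """
    scores = [predicted_condition_number(H, row_model(p)) for p in candidate_poses]
    return candidate_poses[int(np.argmin(scores))]   # smaller condition number is better
```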
- FIG. 13 shows iterative method steps according to embodiments of the invention.
- One of the iterative steps is a movement of the detector.
- a freely movable, for example carryable detector is used.
- detection 614 of radiation by the detector takes place.
- a collection 615 of detector data with information about the detected radiation is carried out by the evaluation system.
- further detector data, such as the position and/or orientation of the detector, are collected, normally synchronized with the detector data containing information about the detected radiation.
- a determination 616 of a quality criterion takes place by the evaluation unit.
- an output 618 of an instruction to a user takes place.
- the output 618 instructs a user to move the detector in a way that a movement corresponding to the instruction leads to the subsequent measurement of suitable detector data.
- Suitable detector data are typically detector data that enhance image generation.
- an output in acoustic form can, for example, be represented by an intensifying signal sound.
- An output in haptic form can, for example, be the provision of a sensation of resistance or of being pulled. This sensation can for example be effected by mechanical guidance or by electrical stimulation of muscles or of the brain.
- anatomical or organ-functional images can also be used.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a detector of the image generating apparatus. Detection can take place during a detection period.
- the radiation can be nuclear radiation.
- the detector can be movable.
- the detector can be freely movable.
- the detector can be handheld.
- the method further includes collecting detector data for image generation by means of an evaluation system of the image generating apparatus.
- detector data include information about the detected radiation.
- the detector data also include information about the position and/or orientation of the detector.
- the method further includes outputting an instruction to a user for further moving the detector in dependence of the collected detector data. According to typical embodiments, the instruction relates to at least a part of the remaining detection period.
- the collection is a renewed, repeated, or continuous collection of detector data.
- outputting an instruction is again, repeatedly, or continuously outputting an instruction to a user for further moving the radiation detector.
- the outputting, renewed outputting, repeated outputting, or continuous outputting of an instruction to a user for further moving the radiation detector includes outputting the position and/or orientation of the detector that, if adopted by the detector, would enhance image generation according to at least one quality value in accordance with a prediction.
- the positions and/or orientations are output, which, if assumed by the detector, would most enhance image generation according to a quality value in light of a prediction. Outputting can take place visually, acoustically, haptically, or by combinations thereof.
- an image generating apparatus for image generation includes a detector for detecting radiation.
- the detector can be a detector for detecting radiation during a detection period.
- the detector can be movable, freely movable, or handheld.
- the radiation can be nuclear radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for transmitting detector data for image generation to the evaluation system. Detector data typically include information about the detected radiation. Detector data typically also include information about the position and/or orientation of the detector.
- the evaluation system further includes a data memory portion for storing the detector data.
- the image generating apparatus further includes an output system for outputting an instruction to a user how to further move the detector in dependence of the detector data. In typical embodiments, the instruction relates to at least a part of the remaining detection period.
- the interface system is an interface system for again, repeatedly, or continuously transmitting detector data to the evaluation system.
- the output system for outputting an instruction to a user is an output system for outputting a renewed, repeated, or continuous instruction to a user for further moving the detector in dependence of the detector data.
- the instructions relate to at least a part of the remaining detection period.
- the output unit is an output unit for outputting the position and/or orientation of the detector which, if assumed by the detector, would enhance, and preferably most enhance, image generation according to at least one quality value in accordance with a prediction.
- the output unit can be an output unit for an output in visual, acoustical, or haptic form or in a combination form thereof.
- Intrinsic problems of processing detector data arise because measurements can take place in principle at each instant in time and with arbitrary position and/or orientation of the detector. Thereby, data may be gathered while the detector is not pointed towards the radiation source that is to be detected. Similarly, further sources for unsuitable data exist. Such data can deteriorate image generation. Such unsuitable data can deteriorate an imaging matrix, for example with respect to relevance or sparsity.
- FIG. 14 shows a freely movable detector 110 being moved along an arbitrary trajectory. The movement direction is indicated by arrows along the trajectory. Positions and orientations that follow a first position and orientation in time are depicted with dashed lines.
- the detector 110 measures the emissions of a radiation source 10 within a spatial region 30 at different, generally arbitrary instances in time.
- the radiation source 10 can for example be a nuclear radiation distribution in the body of a living being.
- FIG. 14 shows at least one position and orientation 630 of the detector which presumably leads to unsuitable detector data with respect to the measured radiation. Unsuitable detector data typically deteriorate image generation.
- a quality control and/or an active enhancement of the image generation rule takes place during the detection period, in contrast to a post selection.
- quality control takes place repeatedly or successively, typically quasi-continuously or continuously.
- the enhancement of an imaging rule can take place repeatedly or successively, typically quasi-continuously or continuously.
- An enhancement can take place as for example described in the section “enhancing image generation” or in another way.
- a method for image generation by means of an image generating apparatus includes detecting radiation by means of a movable detector of the image generating apparatus. Typically, detecting takes place during a detection period.
- the detector can be freely movable.
- the detector can be handheld.
- the radiation can be nuclear radiation.
- the method further includes changing the position and/or orientation of the detector. In typical embodiments, changing the position and/or orientation of the detector takes place during the detection period. Changing can be freely changing the position and/or orientation of the detector. Changing can also be again, repeatedly, or continuously changing.
- the method further includes collecting detector data for image generation by means of the evaluation system of the image generating apparatus. Typically, collecting is again, repeatedly, or continuously collecting detector data. Typically, collecting takes place during the detection period.
- the detector data usually include information about the detected radiation.
- the detector data usually also include information about the position and/or orientation of the detector.
- the method further includes determining at least one quality value from the collected detector data by means of the evaluation system.
- determining at least one quality value takes place again, repeatedly, or continuously, typically during the detection period.
- the at least one quality value is determined with respect to at least one quality criterion.
- Quality criteria can for example be the quality criteria described in the section “quality control”, or can be other quality criteria.
- an image generating apparatus for image generation includes a movable detector for detecting radiation.
- the movable detector is, according to typical embodiments, a detector for detecting radiation during a detection period.
- the detector can be freely movable.
- the detector can be handheld.
- the radiation can be nuclear radiation.
- the image generating apparatus further includes an evaluation system.
- the evaluation system includes an interface system for continuously transmitting detector data for image generation to the evaluation system.
- detector data include information about the detected radiation.
- detector data include also information about the position and/or orientation of the detector.
- the interface system is an interface system for continuously transmitting the detector data during the detection period.
- the evaluation system further includes a data memory for storing the detector data.
- the evaluation system further includes a program memory portion with a program for determining at least one quality value with respect to image generation from the detector data.
- the program for determining at least one quality value is a program for again, repeatedly, or continuously determining at least one quality value with respect to image generation from the detector data.
- the program for determining at least one quality value is a program for determining, again determining, repeatedly determining, or continuously determining at least one quality value with respect to image generation from the detector data during the detection period.
- the program for determining at least one quality value is a program for determining at least one quality value with respect to at least one quality criterion.
- the at least one quality criterion can be a quality criterion as described in the section “quality control”, or can be a further quality criterion.
- the method for image generation includes generating an image by minimization of the dissimilarity or maximization of the similarity, wherein preferably at least one reconstruction method for minimization or maximization is used.
- the at least one reconstruction method can be an algebraic reconstruction technique, short ART, a maximum likelihood expectation maximization algorithm, short MLEM, an iterative matrix inversion method such as the Jacobi method, the Gauss-Seidel method, or the over-relaxation method, a direct matrix inversion method such as the singular value decomposition, or a regularized matrix inversion method such as the singular value decomposition with Tikhonov regularization.
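- As a hedged sketch of one of the named reconstruction methods, an MLEM reconstruction can be written as the following multiplicative update, assuming non-negative entries of H and a strictly positive start image:

```python
import numpy as np

def mlem(H, g, n_iter=50):
    """Maximum likelihood expectation maximization for g ≈ H f with f >= 0."""
    f = np.ones(H.shape[1])                  # strictly positive start image
    sensitivity = H.sum(axis=0) + 1e-12      # column sums used for normalization
    for _ in range(n_iter):
        forward = H @ f + 1e-12              # current forward projection
        f *= (H.T @ (g / forward)) / sensitivity
    return f
```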
- the method for image generation is a method for image generation for medical purposes.
- the method for image generation includes collecting body data of a living being by means of the evaluation system.
- body data include respiration frequency and/or heartbeat frequency.
- the body data also include data with respect to form, position and/or orientation of the body.
- the body data with respect to respiration frequency and/or heartbeat frequency are synchronized with the body data with respect to form, position and/or orientation of the body, and are collected in a synchronized way.
- the gathering of body data of the living being can for example be effected by the tracking system.
- the method for image generation further includes modifying the image generation rule on the basis of the collected body data.
- movements of the body, for example by respiration or heartbeat, can be taken into account for image generation.
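- A minimal sketch of how such body data can enter the image generation rule: each detector readout receives a phase label derived from the respiration (or heartbeat) signal, and only readouts belonging to the same phase are reconstructed together; the signal names and the number of phase bins are assumptions:

```python
import numpy as np

def phase_labels(readout_times, signal_times, signal_values, n_phases=8):
    """Attach a motion phase label (0 .. n_phases-1) to each detector readout."""
    sampled = np.interp(readout_times, signal_times, signal_values)
    normalized = (sampled - sampled.min()) / (np.ptp(sampled) + 1e-12)
    return np.minimum((normalized * n_phases).astype(int), n_phases - 1)

def gate_detector_data(H, g, labels, selected_phase):
    """Keep only the measurements recorded in the selected motion phase."""
    keep = labels == selected_phase
    return H[keep], g[keep]
```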
- registration of images or the registration of detector data is facilitated thereby.
- the method for image generation includes gathering of data of at least one instrument, preferably a medical instrument, by means of the evaluation system.
- the method further includes a registration of the data of medical instruments with detector data and/or simulation detector data by means of the evaluation system.
- the method further includes generation of a combination image on the basis of the registration.
- the method further includes a tracking of the medical instruments by the tracking system.
- the method includes generating an instrument image on the basis of the collected instrument data by means of the evaluation system.
- the method further includes a registration of the instrument image with the first image and/or the second image and/or the third image and/or with an already registered image. Further, the method typically includes generating a combination image on the basis of the registration.
- the method includes outputting a combination image by means of the output system.
- the method includes instructing a user, on the basis of the combination image, how to use the medical instruments.
- the method includes guiding a user while using the medical instruments by means of a guiding system on the basis of the instrument data.
- the guiding system can include a guiding unit guiding a user in haptic, acoustic or visual way, or by combinations thereof.
- instructing the user, on the basis of a combination image, on how to use the medical instruments, or guiding the user while using the medical instruments by a guiding system can take place for example by visualization of a virtual reality, visualization of an augmented reality, by layer and multi-layer visualization, frequency-modulated sound, amplitude-modulated sound, pulse-modulated sound, by combinations thereof, or in any other way.
- the method for image generation includes positioning the living being. Positioning can take place for example by a positioning system which includes a positioning unit. Such a positioning unit can position the living being in any desired position and/or orientation according to some embodiments.
- the image generating apparatus for image generation is an image generating apparatus for image generation for medical purposes.
- the image generating apparatus includes at least one sensor for detecting body data of a living being.
- the body data include respiration frequency and/or heartbeat frequency of the living being.
- the image generating apparatus includes a tracking unit for gathering body data of the living being.
- the body data include the form, position and/or orientation of the body.
- the evaluation system further includes a program memory portion with a program for synchronized collection of body data of the living being.
- the evaluation system further includes a data memory portion for storing the synchronized body data of the living being.
- the evaluation system further includes a program portion with a program for modifying the image generation rule on the basis of the collected body data.
- the evaluation system of the image generating apparatus further includes an interface for collecting data of at least one instrument, typically of at least one medical instrument. Further, the evaluation system includes, according to embodiments of the invention, a program memory portion with a program for generating an instrument image on the basis of the instrument data.
- the evaluation system includes a program memory portion with a program for registering data of medical instruments with detector data and/or simulation detector data. Further, the evaluation system includes a program memory portion with a program for generating a combination image on the basis of the output of the program for registering the data of medical instruments according to some embodiments.
- the evaluation system includes a program memory portion with a program for registering the instrument image with the first image and/or the second image and/or the third image and/or with an already registered image. Further, the evaluation system typically includes a program memory portion with a program for generating a combination image on the basis of the output of the program for registering the instrument image.
- the output system of the image generating apparatus includes an output unit for output of the combination image.
- the output system includes an output unit for instructing a user how to use the medical instruments on the basis of the combination image.
- the image generating apparatus includes a guiding system for guiding the user while using the medical instruments on the basis of the instrument data.
- the guiding system includes at least one guiding unit.
- the output unit for instructing a user how to use the medical instruments on the basis of the combination image as well as the guiding system for guiding the user while using the medical instruments can communicate signals to the user in haptic, acoustic, or visual form, or in a combination form thereof.
- the output unit can also be identical with the guiding unit of the guiding system.
- the output unit can also be different from the guiding unit of the guiding system.
- the output unit and/or the guiding unit can be units for visualization of a virtual reality, for visualization of an augmented reality, for layer and multilayer visualization, for frequency-modulated sound output, for amplitude-modulated sound output, for pulse-modulated sound output, or for output of combinations thereof, or can be units for output in a different way.
- the image generating apparatus further includes a positioning system for positioning the living being.
- the positioning system includes at least one positioning unit.
- the positioning unit can position the living being in any desired position and/or orientation in space.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors, wherein the device includes: (a) a radiation detector; (b) a tracking system for synchronously tracking the position and orientation of said radiation detector and its readout data; (c) a pre-operative nuclear image; (d) a data processing system which communicates with the radiation detector and with the tracking system and is adapted to read the pre-operative nuclear image for allowing a three dimensional reconstruction of the nuclear image and/or the computation of a corresponding quality value from a list of readout data, positions and orientations of the radiation device and the pre-operative nuclear image; and (e) a display for displaying the reconstructed image.
- a device for intra-operative three dimensional nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors including: (a) a radiation detector; (b) a tracking system for tracking the position and orientation of the radiation detector and of its readout data in synchronized form; (c) a pre-operative nuclear image; (d) a data processing system which communicates with the radiation detector and with the tracking system and is able to read the pre-operative nuclear image for allowing the spatial registration of the list of readout data, positions and orientations of the radiation device; and (e) display for displaying the registered images.
- a device for intra-operative three dimensional nuclear imaging, three dimensional visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in the embodiments 1 or 2, further including: (a) artificial markings which are positioned on or in the body part to be imaged; and (b) a second tracking system, which is the same as the first tracking system or which communicates with the first tracking system, and which determines the position and orientation of the artificial markings and communicates with the data processing unit, such that it allows calculation of the position and orientation of the body part that is imaged and of the radiation detector and allows movement and/or deformation compensation.
- a device for intra-operative three dimensional nuclear imaging, three dimensional visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in the embodiments 1 or 2, and also including a calibrated sensor for monitoring the position and orientation of the body part that is imaged, wherein the sensor communicates with the data evaluation unit, such that it allows calculation of the relative position and orientation of the body part that is imaged and of the radiation detector and allows movement and/or deformation compensation.
- a device for intra-operative 3D-nuclear imaging, three dimensional visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in the embodiments 1 or 2, and also including a sensor for monitoring the respiration and a heart signal of the patient, wherein the sensor communicates with the data processing unit, such that a phase label is attached to each readout, position and orientation of the radiation detector, such that movement and/or deformation compensation for respiration, heartbeat, or both is possible.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) at least one surgical instrument and (b) a third tracking system for tracking the surgical instrument, wherein the third tracking system is the same as the first tracking system or communicates with a first tracking system, such that the relative position and orientation of the surgical instrument and of the reconstructed three dimensional image or registered pre-operative image can be calculated and can be used for (a) guiding instruments to regions of increased accumulation; (b) guiding instruments away from regions of increased accumulation; (c) guiding instruments to regions of low accumulation; (d) guiding instruments away from regions of low accumulation; (e) simulating, at the tip of each instrument, the radiation readout that each instrument would give if it were a gamma probe; (f) displaying surgical instruments on the display; and/or (g) detecting when the validity of the images is lost because of the operation in the reconstructed and registered volume by means of the instruments, and warning a surgeon accordingly.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) a display of virtual reality and/or (b) a display of augmented reality, such that the reconstructed 3D-gamma-emitting images and the registered pre-operative images can be displayed three dimensionally in visual, acoustic, haptic or in a combined way, and/or in particular spatially registered with the image geometry of some camera, wherein the camera includes laparoscope cameras and cameras based on surgical microscopes, optical and image-transparent head-mounted displays, optical and image-transparent, stereoscopic surgical microscopes, optical and image-transparent displays.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or any of the following embodiments, wherein the radiation detector is one of the following: gamma probe; beta probe; gamma camera; beta camera; mini gamma camera; or a combination thereof.
- the tracking systems are external tracking systems, for example including optical tracking systems, magnetic tracking systems, mechanical or robot arm-based systems, radio wave-based tracking systems, sound wave-based tracking systems, etc.
- internal tracking systems which for example include acceleration detector-based tracking systems, potentiometer-based tracking systems, etc., or a combination of external tracking systems and/or internal tracking systems.
- monitor systems which for example include: monitors, optically transparent monitors, stereo monitors, stereo-optically transparent head mounted displays, etc.
- acoustical displays which for example include frequency-coded feedback systems, pulse-coded feedback systems, etc.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) a memory system for the involved information, which communicates with a first and second data processing unit and/or (b) a third data processing unit, which communicates with a first and second data processing unit, such that the full information or a part thereof is stored as documentation material and/or an automatic report of the procedure is generated.
- a device for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided operation based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, and further including a sensor and/or a further data processing unit, which can be the same as the first data processing unit or can communicate with a first data processing unit, for the online calculation or the tracking of errors in the position and orientation of any of the tracked objects and/or errors in the readout of the radiation detector.
- a method for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery, based on pre-operative data and tracked radiation detectors, including: (a) detection of radiation by means of a radiation detector; (b) synchronized tracking of the position and orientation of the radiation detector and its readings; (c) readout of at least one pre-operative nuclear image; (d) 3D-reconstruction of a nuclear image from a list of readings, positions and orientations of the radiation device and of the pre-operative nuclear image and/or the computation of a corresponding quality value; and (e) displaying the reconstructed image.
- a method for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided operation, based on pre-operative data and tracked radiation detectors including: (a) detection of radiation by means of a radiation detector; (b) synchronized tracking of position and orientation of the radiation detector and its readings; (c) readout of at least one pre-operative nuclear image; (d) spatially registering a list of readings, positions and orientations of the radiation device; and (e) displaying the registered image.
- a method for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) using at least one surgical instrument; (b) determining the relative positions and orientations of the surgical instruments and of the reconstructed 3D-image or registered pre-operative image; (c) using this relative position and orientation for (1) guiding instruments to regions of enhanced accumulation, (2) for guiding instruments away from regions of enhanced accumulation, (3) for guiding instruments to regions of low accumulation, (4) for guiding instruments away from regions of low accumulation, (5) for simulating, at the tip of each instrument, the radiation reading which would be given if each instrument were a gamma probe, (6) displaying surgical instruments on the display, and/or (7) for detecting and for warning a surgeon when the validity of the images is lost by the operation in the reconstructed and registered volume by means of the instruments.
- a method for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) displaying reconstructed images or registered pre-operative images either visually, acoustically, or haptically, or in a combined way in 3D, and/or in particular spatially registered with the imaging geometry of each camera.
- a method for intra-operative 3D-nuclear imaging, 3D-visualization and image-guided surgery based on pre-operative data and tracked radiation detectors as described in any of the preceding or following embodiments, further including: (a) online computation or tracking of errors in the position and orientation of any of the tracked objects and/or of the error in the reading of the radiation detector; and (b) displaying the error for assigning a level of confidence to the readings and/or compensating the error, for using the gathered information according to the level of confidence and consequently being able to correct the error.
- a device for reliable intra-operative 3D-tomographic nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors includes: (a) a radiation detector; (b) a tracking system for tracking the position and orientation of the radiation detector in a synchronized way; (c) a first data processing unit which communicates with the radiation detector and the tracking system and which is able to evaluate the quality of the gathered data and to determine the necessary projections for reliable 3D-reconstructions; (d) a second data processing unit which communicates with the radiation detector and the tracking system and which is able to carry out a 3D-reconstruction based on the readings of the radiation detector and the corresponding positions and orientations; (e) a display that communicates with the data processing unit and is able to display the necessary projections for a reliable reconstruction to a surgeon and/or for guiding him; (f) a second display that communicates with the data processing unit and is able to display the valid reconstructed 3D-gamma emitting image.
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors according to embodiment 31, wherein the first and second data processing units are the same or communicate with each other.
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors according to embodiment 31, wherein the tracking system is an external tracking system, which for example includes an optical tracking system, magnetic tracking system, mechanical or robot arm-based tracking system, a radio wave-based tracking system, a sound wave-based tracking system, etc. or an internal tracking system, which for example includes an acceleration detector-based tracking system, a potentiometer-based tracking system, etc., or any combination of an external tracking system and/or internal tracking system.
- the tracking system is an external tracking system, which for example includes an optical tracking system, magnetic tracking system, mechanical or robot arm-based tracking system, a radio wave-based tracking system, a sound wave-based tracking system, etc.
- an internal tracking system which for example includes an acceleration detector-based tracking system, a potentiometer-based tracking system, etc., or any combination of an external tracking system and/or internal tracking system.
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors wherein the display is one of the following: (a) a visual display, for example a monitor system, for example including: monitors and optically transparent monitors, stereo monitors, stereo-optical transparent head-mounted displays, etc.; (b) an acoustical display, for example including frequency-coded feedback systems, pulse-coded feedback systems, etc.; (c) a haptic display, for example including force feedback joysticks, force-torque feedback joysticks etc., or (d) a combination of visual, acoustical and/or haptic displays.
- a visual display, for example a monitor system, for example including: monitors and optically transparent monitors, stereo monitors, stereo-optical transparent head-mounted displays, etc.
- an acoustical display, for example including frequency-coded feedback systems, pulse-coded feedback systems, etc.
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors including: (a) synchronized collection of readout data of the radiation detector and of the position and orientation of the radiation detector; (b) evaluation of the quality of the collected readout data, positions and/or orientations; (c) calculation of the necessary set of projections which are needed to allow a reliable 3D-reconstruction (an illustrative sketch of steps (a) to (c) is given after this list of embodiments); (d) displaying the set, or a subset thereof, or information that guides the surgeon to record the needed projections; (e) 3D-reconstruction of a valid 3D-gamma emitting image and/or the calculation of a corresponding quality value.
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors including: (a) at least one surgical instrument; and (b) a second tracking system for tracking surgical instruments, wherein the second tracking system is the same as the first tracking system or communicates with the first tracking system, such that the relative position and orientation of the surgical instruments and of the reconstructed valid 3D-gamma emitting image can be calculated and can be used for (a) guiding the instruments to regions of high accumulation, (b) guiding the instruments away from regions of high accumulation, (c) guiding the instruments to regions of low accumulation, (d) guiding the instruments away from regions of low accumulation, (e) simulating, at the tip of each instrument, the radiation reading which would be given if each instrument were a radiation detector, (f) displaying surgical instruments on the display and/or (g) detecting and warning a surgeon if the validity of the images is lost because of the invasion in the reconstructed volume by means of the instruments.
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors wherein the relative position and orientation of surgical instruments is used for (a) guiding the instruments to regions of high accumulation, (b) guiding the instruments away from regions of high accumulation, (c) guiding the instruments to regions of low accumulation, (d) guiding the instruments away from regions of low accumulation, (e) calculating the radiation readings that surgical instruments at their given positions and orientations would measure if they were used as radiation detectors, (f) displaying the surgical instruments in co-registered form with the reconstructed valid 3D-gamma emitting images on the display, and/or (g) detecting and warning the surgeon if the validity of the images is lost by the invasion of the instruments into the reconstructed volume.
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors further including: (a) a sensor for monitoring the respiration and the heart signal of a patient, wherein the sensor communicates with a data processing unit; (b) a sensor for determining the position and orientation and/or the deformation of the part of the body which is imaged with the system, wherein the sensor communicates with a data processing unit; and/or (c) tracking markings placed on or in the body part that is imaged with the system and a third tracking system, wherein the third tracking system is the same as the first or the second tracking system, or communicates with the first or second tracking system, or communicates with the data processing units, such that each reading of the radiation detector, and the position, orientation and/or deformation, can be calculated in relation to the body part that is imaged, or such that a phase label can be assigned to these with respect to the movement and/or deformation cycles for allowing movement and/or deformation compensation in the reconstruction and/or the display.
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors further including: (a) monitoring the respiration or a heart signal of the patient, (b) monitoring the position and orientation and/or the deformation of the body part that is imaged with the system, and/or (c) tracking the markings which are placed on or in the body part imaged with the system, such that each reading of the radiation detector, position and orientation and/or deformation can be calculated relative to the body part that is imaged, or such that a phase label can be assigned thereto with respect to the movement and/or the deformation cycle for allowing movement and/or deformation compensation in the reconstruction and/or the display (an illustrative sketch of such phase labeling is given after this list of embodiments).
- a device for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors further including: (a) a display of virtual reality and/or (b) a display of augmented reality, such that the reconstructed valid 3D-gamma emitting image can be displayed in 3D in an acoustical, visual, or haptic way, or in a combined way, and/or in particular spatially registered with the image geometry of any camera, including laparoscope cameras and cameras based on surgical microscopes, optical and optically transparent head-mounted displays, optical and optically transparent stereoscopic surgical microscopes, and optical and optically transparent displays.
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors further including: displaying the reconstructed valid 3D-gamma emitting image on (a) a display of virtual reality and/or (b) a display of augmented reality, such that the image can be displayed in 3D in a visual, acoustical, haptic or combined way, and/or in particular spatially registered with the image geometry of any camera, including laparoscope cameras and cameras based on surgical microscopes, optical and optically transparent head-mounted displays, optical and optically transparent stereoscopic surgical microscopes, and optical and optically transparent displays (an illustrative sketch of such camera-registered display is given after this list of embodiments).
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors further including: (a) co-registered acquisition of anatomical or functional images that stem from at least one 3D device and/or (b) use of previously acquired co-registered 3D images that stem from at least one 3D image generating device, such that the reconstructed valid 3D-gamma emitting images can be displayed in a co-registered way with the 3D images and/or such that the 3D-gamma emitting images can be corrected with respect to attenuation and/or scattering by use of the 3D images (an illustrative sketch of such an attenuation correction is given after this list of embodiments).
- a method for reliable intra-operative 3D-nuclear imaging, 3D-visualization of radioactive spatial distributions and image-guided surgery by use of radiation detectors according to any of the embodiments 36, 38, 40, 42, or 44, further including: (a) storing the involved information and/or (b) automatically generating documentation material.
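The following is a minimal, illustrative Python sketch (not the claimed implementation) of the synchronized collection and quality-evaluation steps referenced above: detector readings are paired with synchronized tracking poses, and the angular coverage of the recorded viewing directions is checked to report which projections are still missing before a reliable 3D-reconstruction is attempted. All names (`pair_readings`, `missing_directions`, the bin count and the minimum-samples threshold) are assumptions made for illustration.

```python
# Illustrative sketch only: pair radiation-detector readings with tracked
# detector poses and estimate which viewing directions still lack projections.
import numpy as np

def pair_readings(detector_samples, tracker_samples, max_dt=0.02):
    """Match each detector sample (t, counts) to the nearest tracker sample
    (t, position, direction) within max_dt seconds."""
    paired = []
    track_times = np.array([t for t, _, _ in tracker_samples])
    for t, counts in detector_samples:
        i = int(np.argmin(np.abs(track_times - t)))
        if abs(track_times[i] - t) <= max_dt:
            _, position, direction = tracker_samples[i]
            paired.append((counts, np.asarray(position), np.asarray(direction)))
    return paired

def missing_directions(paired, n_bins=18, min_samples=5):
    """Histogram the detector viewing directions over the polar angle and
    report angular bins that contain too few projections."""
    hist = np.zeros(n_bins, dtype=int)
    for _, _, direction in paired:
        d = direction / np.linalg.norm(direction)
        theta = np.arccos(np.clip(d[2], -1.0, 1.0))      # polar angle, 0..pi
        hist[min(int(theta / np.pi * n_bins), n_bins - 1)] += 1
    return [b for b in range(n_bins) if hist[b] < min_samples]  # bins still to scan
```

The bins returned by `missing_directions` are the kind of information a guidance display could show the surgeon so that the missing projections are recorded before reconstruction.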
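Below is a minimal sketch of simulating, at a tracked instrument tip, the reading a gamma probe would give, as referenced in the embodiments above. It assumes an already reconstructed activity image sampled at known voxel centers, a simple detection cone with inverse-square weighting, and it ignores attenuation and scatter; the function name, the opening angle and the sensitivity constant are illustrative assumptions.

```python
# Illustrative sketch only: simulated probe reading at an instrument tip,
# given a reconstructed 3D activity image and its voxel center coordinates.
import numpy as np

def simulated_probe_reading(tip_pos, tip_dir, activity, voxel_centers,
                            half_opening_deg=30.0, sensitivity=1e-4):
    tip_dir = tip_dir / np.linalg.norm(tip_dir)
    rel = voxel_centers - tip_pos                    # (N, 3) vectors tip -> voxel
    dist = np.linalg.norm(rel, axis=1)
    valid = dist > 1e-6
    cos_angle = (rel[valid] @ tip_dir) / dist[valid]
    in_cone = cos_angle >= np.cos(np.radians(half_opening_deg))
    # inverse-square geometric weighting for voxels inside the probe's cone
    weights = np.zeros_like(dist)
    idx = np.where(valid)[0][in_cone]
    weights[idx] = 1.0 / dist[idx] ** 2
    return sensitivity * float(np.sum(activity.ravel() * weights))
```

In practice such a simulated reading could be displayed alongside, or compared against, the actual measurement of a real probe held at the same pose.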
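A minimal sketch of assigning a level of confidence to a detector reading from the reported tracking error and Poisson counting statistics, as referenced above. The thresholds are illustrative assumptions, not values taken from the disclosure.

```python
# Illustrative sketch only: derive a confidence label for a reading so it can
# be displayed with, or weighted by, that level of confidence.
import math

def reading_confidence(counts, acquisition_time_s, tracking_error_mm):
    rate = counts / acquisition_time_s                     # counts per second
    # relative statistical uncertainty of a Poisson count
    rel_stat_error = 1.0 / math.sqrt(counts) if counts > 0 else 1.0
    if tracking_error_mm < 1.0 and rel_stat_error < 0.05:
        return "high"
    if tracking_error_mm < 3.0 and rel_stat_error < 0.15:
        return "medium"
    return "low"
```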
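A minimal sketch of assigning phase labels to detector or tracking samples with respect to a breathing cycle, as referenced above, so that reconstruction or display can be gated or motion-compensated. It assumes a sampled respiration signal; the zero-crossing cycle detection and the number of phase bins are illustrative choices.

```python
# Illustrative sketch only: phase labels within the respiration cycle.
import numpy as np

def phase_labels(sample_times, resp_times, resp_signal, n_phases=8):
    resp_signal = np.asarray(resp_signal, dtype=float)
    # detect cycle starts at upward zero crossings of the mean-subtracted signal
    s = resp_signal - resp_signal.mean()
    crossings = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    cycle_starts = np.asarray(resp_times)[crossings]
    labels = []
    for t in sample_times:
        prev = cycle_starts[cycle_starts <= t]
        nxt = cycle_starts[cycle_starts > t]
        if len(prev) == 0 or len(nxt) == 0:
            labels.append(-1)                         # outside any complete cycle
            continue
        frac = (t - prev[-1]) / (nxt[0] - prev[-1])   # position within the cycle
        labels.append(min(int(frac * n_phases), n_phases - 1))
    return labels
```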
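A minimal sketch of spatially registering the reconstructed 3D-gamma emitting image with the imaging geometry of a tracked camera (for example a laparoscope), as referenced above, using a standard pinhole model. The calibrated intrinsics `K` and the tracked extrinsics `(R, t)` are assumed to be available; the hotspot threshold is an illustrative choice.

```python
# Illustrative sketch only: project high-activity voxels into a tracked
# camera's image plane for an augmented-reality overlay.
import numpy as np

def project_hotspots(voxel_centers, activity, K, R, t, threshold_fraction=0.5):
    activity = activity.ravel()
    hot = voxel_centers[activity >= threshold_fraction * activity.max()]
    cam = R @ hot.T + t.reshape(3, 1)                # world -> camera coordinates
    in_front = cam[2] > 0                            # keep voxels in front of the camera
    proj = K @ cam[:, in_front]
    pixels = (proj[:2] / proj[2]).T                  # (M, 2) pixel coordinates
    return pixels
```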
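Finally, a minimal sketch of attenuation correction by use of a co-registered 3D image, as referenced above: the correction factor for a voxel-to-detector path is approximated as the exponential of the negative line integral of an attenuation map. The isotropic 1 mm voxel assumption and the sampling step count are illustrative.

```python
# Illustrative sketch only: attenuation factor along a straight line through a
# co-registered attenuation map (mu, in 1/mm) on the emission image grid.
import numpy as np

def attenuation_factor(mu_map, voxel_idx, detector_idx, n_steps=200):
    """Approximate exp(-line integral of mu) between two voxel indices by
    sampling the mu map at n_steps points along the connecting line."""
    p0 = np.asarray(voxel_idx, dtype=float)
    p1 = np.asarray(detector_idx, dtype=float)
    step = (p1 - p0) / n_steps
    step_len_mm = np.linalg.norm(step)               # assuming 1 mm isotropic voxels
    total_mu = 0.0
    for k in range(n_steps):
        sample = np.round(p0 + (k + 0.5) * step).astype(int)
        if np.all(sample >= 0) and np.all(sample < mu_map.shape):
            total_mu += mu_map[tuple(sample)] * step_len_mm
    return float(np.exp(-total_mu))
```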
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- High Energy & Nuclear Physics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Nuclear Medicine (AREA)
- Measurement Of Radiation (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP07010369.2 | 2007-05-24 | ||
| EP07010368 | 2007-05-24 | ||
| EP07010368.4 | 2007-05-24 | ||
| EP07010369 | 2007-05-24 | ||
| PCT/EP2008/056433 WO2008142172A2 (de) | 2007-05-24 | 2008-05-26 | Image formation apparatus and method for nuclear imaging |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2008/056433 A-371-Of-International WO2008142172A2 (de) | 2007-05-24 | 2008-05-26 | Image formation apparatus and method for nuclear imaging |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/738,560 Continuation US9743898B2 (en) | 2007-05-24 | 2015-06-12 | Image formation apparatus and method for nuclear imaging |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20100266171A1 true US20100266171A1 (en) | 2010-10-21 |
Family
ID=39986366
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/601,800 Abandoned US20100266171A1 (en) | 2007-05-24 | 2008-05-26 | Image formation apparatus and method for nuclear imaging |
| US14/738,560 Expired - Fee Related US9743898B2 (en) | 2007-05-24 | 2015-06-12 | Image formation apparatus and method for nuclear imaging |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/738,560 Expired - Fee Related US9743898B2 (en) | 2007-05-24 | 2015-06-12 | Image formation apparatus and method for nuclear imaging |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US20100266171A1 (en) |
| EP (2) | EP2755051A3 (en) |
| JP (2) | JP5437997B2 (en) |
| DE (1) | DE102008025151A1 (en) |
| WO (1) | WO2008142172A2 (en) |
Cited By (33)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120127203A1 (en) * | 2010-11-18 | 2012-05-24 | Canon Kabushiki Kaisha | Mixed reality display |
| US20120127302A1 (en) * | 2010-11-18 | 2012-05-24 | Canon Kabushiki Kaisha | Mixed reality display |
| CN102630052A (zh) * | 2012-04-16 | 2012-08-08 | Shanghai Jiao Tong University | Real-time stream-oriented television program recommendation system |
| DE102011121708A1 (de) * | 2011-12-20 | 2013-06-20 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| US20130245428A1 (en) * | 2012-03-16 | 2013-09-19 | Toshiba Medical Systems Corporation | Patient-probe-operator tracking method and apparatus for ultrasound imaging systems |
| US20140031723A1 (en) * | 2012-07-26 | 2014-01-30 | Infoscitex Corporation | Orientation Tracking System and Method |
| US20140031668A1 (en) * | 2010-09-08 | 2014-01-30 | Disruptive Navigational Technologies, Llc | Surgical and Medical Instrument Tracking Using a Depth-Sensing Device |
| WO2014080013A1 (de) * | 2012-11-23 | 2014-05-30 | Surgiceye Gmbh | Hybrid imaging system and method for intraoperative, interventional, and diagnostic applications |
| CN104000655A (zh) * | 2013-02-25 | 2014-08-27 | Siemens Aktiengesellschaft | Combined surface reconstruction and registration for laparoscopic surgery |
| US20140241600A1 (en) * | 2013-02-25 | 2014-08-28 | Siemens Aktiengesellschaft | Combined surface reconstruction and registration for laparoscopic surgery |
| US8831708B2 (en) | 2011-03-15 | 2014-09-09 | Siemens Aktiengesellschaft | Multi-modal medical imaging |
| WO2014118640A3 (en) * | 2013-01-31 | 2014-11-13 | Novadaq Technologies Inc. | Virtual-reality simulator to provide training for sentinel lymph node surgery using image data and database data |
| JP2014530348A (ja) * | 2011-09-16 | 2014-11-17 | Surgiceye Gmbh | Radiation image system and method for updating an original radiation image |
| US9040925B2 (en) | 2012-12-21 | 2015-05-26 | Canberra Industries, Inc. | Spatially-aware radiation probe system and method |
| US20150257718A1 (en) * | 2012-09-28 | 2015-09-17 | The Regents Of The University Of California | Realtime imaging and radiotherapy of microscopic disease |
| US20150305701A1 (en) * | 2007-05-24 | 2015-10-29 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| WO2016012556A1 (de) * | 2014-07-25 | 2016-01-28 | Surgiceye Gmbh | Image generation apparatus and method combining functional imaging and ultrasound imaging |
| US20160078682A1 (en) * | 2013-04-24 | 2016-03-17 | Kawasaki Jukogyo Kabushiki Kaisha | Component mounting work support system and component mounting method |
| US9561019B2 (en) | 2012-03-07 | 2017-02-07 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
| US9892564B1 (en) * | 2017-03-30 | 2018-02-13 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US20180185113A1 (en) * | 2016-09-09 | 2018-07-05 | GYS Tech, LLC d/b/a Cardan Robotics | Methods and Systems for Display of Patient Data in Computer-Assisted Surgery |
| US20180279883A1 (en) * | 2015-09-29 | 2018-10-04 | Technische Universität München | Apparatus and method for augmented visualization employing X-ray and optical data |
| US10258427B2 (en) * | 2015-12-18 | 2019-04-16 | Orthogrid Systems, Inc. | Mixed reality imaging apparatus and surgical suite |
| US10282871B2 (en) | 2017-07-10 | 2019-05-07 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for pet image reconstruction |
| US10617401B2 (en) | 2014-11-14 | 2020-04-14 | Ziteo, Inc. | Systems for localization of targets inside a body |
| US20200279412A1 (en) * | 2014-07-30 | 2020-09-03 | Navix International Limited | Probe localization |
| US20220172412A1 (en) * | 2020-12-01 | 2022-06-02 | Our United Corporation | Medical image reconstruction method, computer device and storage medium |
| US11402515B2 (en) | 2016-05-31 | 2022-08-02 | David W. Holdsworth | Gamma probe and multimodal intraoperative imaging system |
| US11439358B2 (en) * | 2019-04-09 | 2022-09-13 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
| US20230066480A1 (en) * | 2021-09-01 | 2023-03-02 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| CN116260962A (zh) * | 2023-01-29 | 2023-06-13 | Telecommunication Science and Technology Instrument Research Institute Co., Ltd. | Radiation-resistance testing device and method for a surveillance camera sensor |
| US11726194B2 (en) | 2018-10-08 | 2023-08-15 | Norihiro NANGOU | Imaging apparatus not easily affected by radiation, and image display apparatus |
| US20230401766A1 (en) * | 2021-09-01 | 2023-12-14 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
Families Citing this family (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102012008765A1 (de) | 2012-05-06 | 2013-11-07 | Surgiceye Gmbh | Hybrid imaging system for intraoperative and diagnostic applications |
| US9381376B2 (en) * | 2012-10-12 | 2016-07-05 | Varian Medical Systems International Ag | Systems, devices, and methods for quality assurance of radiation therapy |
| CA2899289A1 (en) * | 2013-02-04 | 2014-08-07 | Novadaq Technologies Inc. | Combined radiationless automated three dimensional patient habitus imaging with scintigraphy |
| DE102014108055A1 (de) | 2014-06-06 | 2015-12-17 | Surgiceye Gmbh | Device for detecting a nuclear radiation distribution |
| US9646419B2 (en) * | 2015-01-14 | 2017-05-09 | International Business Machines Corporation | Augmented reality device display of image recognition analysis matches |
| DE102015206511B4 (de) * | 2015-04-13 | 2023-10-19 | Siemens Healthcare Gmbh | Determination of a unique spatial relation of a medical device to a further object |
| US10143430B2 (en) | 2015-06-18 | 2018-12-04 | The Cleveland Clinic Foundation | Systems and methods that use multi-modal imaging for enhanced resolution images |
| DE102015111417A1 (de) | 2015-07-14 | 2017-01-19 | Surgiceye Gmbh | Endoscope for optical and molecular imaging |
| DE102016105793A1 (de) | 2016-03-30 | 2017-10-05 | Piur Imaging Gmbh | Device and method for position detection of a mobile medical device |
| IL245339A (en) * | 2016-04-21 | 2017-10-31 | Rani Ben Yishai | Method and system for verification of registration |
| US10251011B2 (en) | 2017-04-24 | 2019-04-02 | Intel Corporation | Augmented reality virtual reality ray tracing sensory enhancement system, apparatus and method |
| JP7226827B2 (ja) * | 2017-06-15 | 2023-02-21 | Consejo Superior De Investigaciones Cientificas (CSIC) | System for generating a first image and a second image of a subject and method for operating the system |
| DE102020107965B3 (de) * | 2020-03-23 | 2021-09-09 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung eingetragener Verein | Method for the optical determination of an intensity distribution |
| US12004829B2 (en) * | 2020-06-09 | 2024-06-11 | Verb Surgical Inc. | Inverse kinematics of a surgical robot for teleoperation with hardware constraints |
| EP4013048A1 (en) * | 2020-12-08 | 2022-06-15 | Koninklijke Philips N.V. | Object visualization |
| WO2022198010A1 (en) * | 2021-03-19 | 2022-09-22 | Owl Navigation, Inc. | Brain image segmentation using trained convolutional neural networks |
| US12016642B2 (en) | 2021-09-08 | 2024-06-25 | Proprio, Inc. | Constellations for tracking instruments, such as surgical instruments, and associated systems and methods |
| WO2023220605A2 (en) * | 2022-05-09 | 2023-11-16 | Proprio, Inc. | Methods and systems for calibrating instruments within an imaging system, such as a surgical imaging system |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6167296A (en) * | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
| US20020156366A1 (en) * | 2001-04-19 | 2002-10-24 | Jeff Stainsby | Realtime MR scan prescription using physiological information |
| US6510336B1 (en) * | 2000-03-03 | 2003-01-21 | Intra Medical Imaging, Llc | Methods and devices to expand applications of intraoperative radiation probes |
| US20040204646A1 (en) * | 2002-11-04 | 2004-10-14 | V-Target Technologies Ltd. | Intracorporeal-imaging head |
| US20060093213A1 (en) * | 2004-10-28 | 2006-05-04 | Eran Steinberg | Method and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering |
| US20070236514A1 (en) * | 2006-03-29 | 2007-10-11 | Bracco Imaging Spa | Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation |
Family Cites Families (25)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5666444A (en) * | 1982-04-06 | 1997-09-09 | Canon Kabushiki Kaisha | Image processing apparatus |
| JP2556031B2 (ja) * | 1987-04-30 | 1996-11-20 | Shimadzu Corporation | SPECT image reconstruction apparatus |
| JP4049829B2 (ja) * | 1995-06-23 | 2008-02-20 | Toshiba Corporation | Radiological diagnostic apparatus |
| US6674916B1 (en) * | 1999-10-18 | 2004-01-06 | Z-Kat, Inc. | Interpolation in transform space for multiple rigid object registration |
| US7068854B1 (en) * | 1999-12-29 | 2006-06-27 | Ge Medical Systems Global Technology Company, Llc | Correction of defective pixels in a detector |
| DE10019955A1 (de) * | 2000-04-20 | 2001-10-25 | Philips Corp Intellectual Pty | X-ray examination apparatus and method for generating an X-ray image |
| IL154323A0 (en) * | 2000-08-21 | 2003-09-17 | Target Technologies Ltd V | Radioactive emission detector equipped with a position tracking system and utilization thereof with medical systems and in medical procedures |
| US8565860B2 (en) * | 2000-08-21 | 2013-10-22 | Biosensors International Group, Ltd. | Radioactive emission detector equipped with a position tracking system |
| JP4031618B2 (ja) * | 2001-02-28 | 2008-01-09 | Anzai Medical Co., Ltd. | Radiation source detection apparatus |
| US6631285B2 (en) * | 2001-03-15 | 2003-10-07 | Koninklijke Philips Electronics, N. V. | Fast transform for reconstruction of rotating-slat data |
| JP2005521502A (ja) * | 2002-04-03 | 2005-07-21 | Segami S.A.R.L. | Superposition of thoracic and abdominal imaging modalities |
| US6804325B1 (en) * | 2002-10-25 | 2004-10-12 | Southeastern Universities Research Assn. | Method for position emission mammography image reconstruction |
| JP2005013291A (ja) * | 2003-06-23 | 2005-01-20 | Hitachi Medical Corp | Ultrasonic probe and ultrasonic diagnostic apparatus |
| JP4317412B2 (ja) * | 2003-09-29 | 2009-08-19 | Hitachi, Ltd. | Image processing method |
| JP4493316B2 (ja) * | 2003-10-14 | 2010-06-30 | Olympus Corporation | Ultrasonic diagnostic apparatus and image processing program |
| JP4592346B2 (ja) * | 2004-07-14 | 2010-12-01 | Aloka Co., Ltd. | Medical diagnostic system |
| EP1815420B1 (en) * | 2004-11-19 | 2018-01-10 | Koninklijke Philips N.V. | Optimal conversion of 3d image sets between different spaces |
| JP2006198060A (ja) * | 2005-01-19 | 2006-08-03 | Ziosoft Inc | Image processing method and image processing program |
| JP4820561B2 (ja) * | 2005-03-14 | 2011-11-24 | Toshiba Corporation | Nuclear medicine diagnostic apparatus |
| JP4581088B2 (ja) * | 2005-05-17 | 2010-11-17 | University of Tsukuba | Computer-aided diagnosis apparatus and method |
| US8155415B2 (en) * | 2005-07-01 | 2012-04-10 | Siemens Medical Solutions Usa, Inc. | Extension of truncated CT images for use with emission tomography in multimodality medical images |
| WO2007131561A2 (en) | 2006-05-16 | 2007-11-22 | Surgiceye Gmbh | Method and device for 3d acquisition, 3d visualization and computer guided surgery using nuclear probes |
| JP5437997B2 (ja) * | 2007-05-24 | 2014-03-12 | Surgiceye Gmbh | Image generation apparatus and method for radioactive imaging |
| JP4931238B2 (ja) * | 2007-08-14 | 2012-05-16 | Canon Kabushiki Kaisha | Imaging apparatus and driving method thereof |
| DE102011053708A1 (de) * | 2011-09-16 | 2013-03-21 | Surgiceye Gmbh | Nuclear image system and method for updating an original nuclear image |
-
2008
- 2008-05-26 JP JP2010508870A patent/JP5437997B2/ja not_active Expired - Fee Related
- 2008-05-26 EP EP14164239.7A patent/EP2755051A3/de not_active Withdrawn
- 2008-05-26 WO PCT/EP2008/056433 patent/WO2008142172A2/de not_active Ceased
- 2008-05-26 US US12/601,800 patent/US20100266171A1/en not_active Abandoned
- 2008-05-26 DE DE102008025151A patent/DE102008025151A1/de not_active Ceased
- 2008-05-26 EP EP08760032.6A patent/EP2165215B1/de not_active Not-in-force
-
2013
- 2013-12-12 JP JP2013256968A patent/JP5976627B2/ja not_active Expired - Fee Related
-
2015
- 2015-06-12 US US14/738,560 patent/US9743898B2/en not_active Expired - Fee Related
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6167296A (en) * | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
| US6510336B1 (en) * | 2000-03-03 | 2003-01-21 | Intra Medical Imaging, Llc | Methods and devices to expand applications of intraoperative radiation probes |
| US20020156366A1 (en) * | 2001-04-19 | 2002-10-24 | Jeff Stainsby | Realtime MR scan prescription using physiological information |
| US20040204646A1 (en) * | 2002-11-04 | 2004-10-14 | V-Target Technologies Ltd. | Intracorporeal-imaging head |
| US20060093213A1 (en) * | 2004-10-28 | 2006-05-04 | Eran Steinberg | Method and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering |
| US20070236514A1 (en) * | 2006-03-29 | 2007-10-11 | Bracco Imaging Spa | Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation |
Cited By (65)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150305701A1 (en) * | 2007-05-24 | 2015-10-29 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| US9743898B2 (en) * | 2007-05-24 | 2017-08-29 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| US20140031668A1 (en) * | 2010-09-08 | 2014-01-30 | Disruptive Navigational Technologies, Llc | Surgical and Medical Instrument Tracking Using a Depth-Sensing Device |
| US20120127302A1 (en) * | 2010-11-18 | 2012-05-24 | Canon Kabushiki Kaisha | Mixed reality display |
| US20120127203A1 (en) * | 2010-11-18 | 2012-05-24 | Canon Kabushiki Kaisha | Mixed reality display |
| US8831708B2 (en) | 2011-03-15 | 2014-09-09 | Siemens Aktiengesellschaft | Multi-modal medical imaging |
| US20140369560A1 (en) * | 2011-09-16 | 2014-12-18 | Surgiceye Gmbh | Nuclear Image System and Method for Updating an Original Nuclear Image |
| JP2014530348A (ja) * | 2011-09-16 | 2014-11-17 | Surgiceye Gmbh | Radiation image system and method for updating an original radiation image |
| US9286732B2 (en) * | 2011-09-16 | 2016-03-15 | Surgiceye Gmbh | Nuclear image system and method for updating an original nuclear image |
| US20130338490A1 (en) * | 2011-12-20 | 2013-12-19 | Surgiceye Gmbh | Apparatus and method for nuclear imaging |
| US9345441B2 (en) * | 2011-12-20 | 2016-05-24 | Surgiceye Gmbh | Apparatus and method for nuclear imaging |
| EP2606825A1 (de) * | 2011-12-20 | 2013-06-26 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| DE102011121708A1 (de) * | 2011-12-20 | 2013-06-20 | Surgiceye Gmbh | Image formation apparatus and method for nuclear imaging |
| US9561019B2 (en) | 2012-03-07 | 2017-02-07 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
| US10426350B2 (en) | 2012-03-07 | 2019-10-01 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
| US11678804B2 (en) | 2012-03-07 | 2023-06-20 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
| US20130245428A1 (en) * | 2012-03-16 | 2013-09-19 | Toshiba Medical Systems Corporation | Patient-probe-operator tracking method and apparatus for ultrasound imaging systems |
| US9474505B2 (en) * | 2012-03-16 | 2016-10-25 | Toshiba Medical Systems Corporation | Patient-probe-operator tracking method and apparatus for ultrasound imaging systems |
| CN102630052A (zh) * | 2012-04-16 | 2012-08-08 | Shanghai Jiao Tong University | Real-time stream-oriented television program recommendation system |
| US8953154B2 (en) * | 2012-07-26 | 2015-02-10 | Vivonics, Inc. | Orientation tracking system and method |
| US20140031723A1 (en) * | 2012-07-26 | 2014-01-30 | Infoscitex Corporation | Orientation Tracking System and Method |
| US20150257718A1 (en) * | 2012-09-28 | 2015-09-17 | The Regents Of The University Of California | Realtime imaging and radiotherapy of microscopic disease |
| WO2014080013A1 (de) * | 2012-11-23 | 2014-05-30 | Surgiceye Gmbh | Hybrid imaging system and method for intraoperative, interventional, and diagnostic applications |
| US20150305700A1 (en) * | 2012-11-23 | 2015-10-29 | Surgiceye Gmbh | Hybrid imaging system and method for intraoperative, interventional, and diagnostic applications |
| EP2746815B1 (en) | 2012-12-21 | 2020-05-13 | Mirion Technologies (Canberra), Inc. | Spatially-aware radiation probe system and method |
| US9040925B2 (en) | 2012-12-21 | 2015-05-26 | Canberra Industries, Inc. | Spatially-aware radiation probe system and method |
| WO2014118640A3 (en) * | 2013-01-31 | 2014-11-13 | Novadaq Technologies Inc. | Virtual-reality simulator to provide training for sentinel lymph node surgery using image data and database data |
| CN104000655A (zh) * | 2013-02-25 | 2014-08-27 | Siemens Aktiengesellschaft | Combined surface reconstruction and registration for laparoscopic surgery |
| US9129422B2 (en) * | 2013-02-25 | 2015-09-08 | Siemens Aktiengesellschaft | Combined surface reconstruction and registration for laparoscopic surgery |
| US20140241600A1 (en) * | 2013-02-25 | 2014-08-28 | Siemens Aktiengesellschaft | Combined surface reconstruction and registration for laparoscopic surgery |
| US20160078682A1 (en) * | 2013-04-24 | 2016-03-17 | Kawasaki Jukogyo Kabushiki Kaisha | Component mounting work support system and component mounting method |
| WO2016012556A1 (de) * | 2014-07-25 | 2016-01-28 | Surgiceye Gmbh | Image generation apparatus and method combining functional imaging and ultrasound imaging |
| US20200279412A1 (en) * | 2014-07-30 | 2020-09-03 | Navix International Limited | Probe localization |
| US11464503B2 (en) | 2014-11-14 | 2022-10-11 | Ziteo, Inc. | Methods and systems for localization of targets inside a body |
| US20230028501A1 (en) * | 2014-11-14 | 2023-01-26 | Ziteo, Inc. | Methods and systems for localization of targets inside a body |
| US12239301B2 (en) * | 2014-11-14 | 2025-03-04 | Ziteo, Inc. | Methods and systems for localization of targets inside a body |
| US10617401B2 (en) | 2014-11-14 | 2020-04-14 | Ziteo, Inc. | Systems for localization of targets inside a body |
| US20180279883A1 (en) * | 2015-09-29 | 2018-10-04 | Technische Universität München | Apparatus and method for augmented visualization employing X-ray and optical data |
| US11045090B2 (en) * | 2015-09-29 | 2021-06-29 | Technische Universität München | Apparatus and method for augmented visualization employing X-ray and optical data |
| US10258427B2 (en) * | 2015-12-18 | 2019-04-16 | Orthogrid Systems, Inc. | Mixed reality imaging apparatus and surgical suite |
| US11402515B2 (en) | 2016-05-31 | 2022-08-02 | David W. Holdsworth | Gamma probe and multimodal intraoperative imaging system |
| US20180185113A1 (en) * | 2016-09-09 | 2018-07-05 | GYS Tech, LLC d/b/a Cardan Robotics | Methods and Systems for Display of Patient Data in Computer-Assisted Surgery |
| US10653495B2 (en) * | 2016-09-09 | 2020-05-19 | Mobius Imaging Llc | Methods and systems for display of patient data in computer-assisted surgery |
| US11141237B2 (en) | 2016-09-09 | 2021-10-12 | Mobius Imaging Llc | Methods and systems for display of patient data in computer-assisted surgery |
| US12167940B2 (en) | 2016-09-09 | 2024-12-17 | Mobius Imaging, Llc | Methods and systems for display of patient data in computer-assisted surgery |
| US11737850B2 (en) | 2016-09-09 | 2023-08-29 | Mobius Imaging Llc | Methods and systems for display of patient data in computer-assisted surgery |
| US11004271B2 (en) | 2017-03-30 | 2021-05-11 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US10475244B2 (en) * | 2017-03-30 | 2019-11-12 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US9892564B1 (en) * | 2017-03-30 | 2018-02-13 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US11481987B2 (en) | 2017-03-30 | 2022-10-25 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US20180286132A1 (en) * | 2017-03-30 | 2018-10-04 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US10282871B2 (en) | 2017-07-10 | 2019-05-07 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for pet image reconstruction |
| US10565746B2 (en) | 2017-07-10 | 2020-02-18 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for PET image reconstruction |
| US11726194B2 (en) | 2018-10-08 | 2023-08-15 | Norihiro NANGOU | Imaging apparatus not easily affected by radiation, and image display apparatus |
| US12329551B2 (en) | 2019-04-09 | 2025-06-17 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
| US11883214B2 (en) | 2019-04-09 | 2024-01-30 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
| US11439358B2 (en) * | 2019-04-09 | 2022-09-13 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
| US12394117B2 (en) * | 2020-12-01 | 2025-08-19 | Our United Corporation | Medical image reconstruction method, computer device and storage medium |
| US20220172412A1 (en) * | 2020-12-01 | 2022-06-02 | Our United Corporation | Medical image reconstruction method, computer device and storage medium |
| US20230401766A1 (en) * | 2021-09-01 | 2023-12-14 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| US12067653B2 (en) * | 2021-09-01 | 2024-08-20 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| US20230316599A1 (en) * | 2021-09-01 | 2023-10-05 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| US11763499B2 (en) * | 2021-09-01 | 2023-09-19 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| US20230066480A1 (en) * | 2021-09-01 | 2023-03-02 | Mazor Robotics Ltd. | Systems, methods, and devices for generating a corrected image |
| CN116260962A (zh) * | 2023-01-29 | 2023-06-13 | Telecommunication Science and Technology Instrument Research Institute Co., Ltd. | Radiation-resistance testing device and method for a surveillance camera sensor |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2165215A2 (de) | 2010-03-24 |
| WO2008142172A3 (de) | 2009-04-16 |
| JP2014089198A (ja) | 2014-05-15 |
| US9743898B2 (en) | 2017-08-29 |
| JP5976627B2 (ja) | 2016-08-23 |
| EP2165215B1 (de) | 2014-05-07 |
| DE102008025151A1 (de) | 2008-12-18 |
| EP2755051A2 (de) | 2014-07-16 |
| JP2010528277A (ja) | 2010-08-19 |
| US20150305701A1 (en) | 2015-10-29 |
| EP2755051A3 (de) | 2014-08-27 |
| JP5437997B2 (ja) | 2014-03-12 |
| WO2008142172A2 (de) | 2008-11-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9743898B2 (en) | Image formation apparatus and method for nuclear imaging | |
| US12329551B2 (en) | Methods and systems for high performance and versatile molecular imaging | |
| US11504095B2 (en) | Three-dimensional imaging and modeling of ultrasound image data | |
| CN107106241B (zh) | System for navigating a surgical instrument | |
| US7711406B2 (en) | System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging | |
| US7831096B2 (en) | Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use | |
| US10925567B2 (en) | Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments | |
| KR100971417B1 (ko) | Ultrasound system for displaying a medical needle on a composite image of an ultrasound image and an external medical image | |
| US9320569B2 (en) | Systems and methods for implant distance measurement | |
| US9345441B2 (en) | Apparatus and method for nuclear imaging | |
| US8131031B2 (en) | Systems and methods for inferred patient annotation | |
| EP3468668B1 (en) | Soft tissue tracking using physiologic volume rendering | |
| JP2014530348A (ja) | Radiation image system and method for updating an original radiation image | |
| US20150305700A1 (en) | Hybrid imaging system and method for intraoperative, interventional, and diagnostic applications | |
| WO2008035271A2 (en) | Device for registering a 3d model | |
| US9477686B2 (en) | Systems and methods for annotation and sorting of surgical images | |
| CN1897878A (zh) | System for introducing a medical instrument into the body of a patient | |
| US20240341734A1 (en) | Ultrasound Depth Calibration for Improving Navigational Accuracy | |
| Remes | Registration accuracy of the optical navigation system for image-guided surgery | |
| WO2025011992A1 (en) | Simulated 4d ct surviews using 3d surview and rgb-d camera for improved scan planning | |
| Brack et al. | Application of stereoscopic arc photogrammetry to image-guided radiation therapy and treatment planning |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SURGICEYE GMBH, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WENDLER, THOMAS;NAVAB, NASSIR;TRAUB, JOERG;REEL/FRAME:023567/0100 Effective date: 20091124 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |