WO2011070465A2 - Method and apparatus for using time of flight information to detect and correct for motion in imaging scans - Google Patents
- Publication number
- WO2011070465A2 (PCT application PCT/IB2010/055248)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- respiratory
- motion
- data
- cardiac
- time
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/037—Emission tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/507—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for determination of haemodynamic parameters, e.g. perfusion CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/541—Control of apparatus or devices for radiation diagnosis involving acquisition triggered by a physiological signal
Definitions
- the present application relates generally to the imaging arts. More specifically, it provides methods and apparatuses for using time of flight information to detect motion which occurs during a medical imaging acquisition, such as a positron emission tomography (PET) imaging acquisition.
- the present application also provides methods and apparatuses for correcting for respiratory and cardiac motion in PET images.
- the application subject matter finds use at least with PET imaging, and will be described with particular reference thereto. However, it also has more general application with other imaging methods and in other arts, such as SPECT imaging or CT imaging.
- Motion that occurs during a medical imaging acquisition can be problematic, as it can result in the deterioration of image quality and compromise the clinical utility of the resulting image data.
- Motion artifacts can result from a variety of different kinds of motion, such as for example respiratory motion, cardiac motion, and gross patient motion.
- Respiratory motion is motion caused by the expansion and contraction of the lungs during a respiratory cycle.
- Cardiac motion is motion caused by the expansion and contraction of the heart during a cardiac cycle.
- Gross patient motion is motion caused by voluntary or involuntary muscular movement of body parts, such as the chest, arms or legs. The likelihood that any of these kinds of motion will be problematic can be increased during PET imaging acquisitions, because PET imaging acquisition times are typically lengthy, on the order of minutes or tens of minutes.
- a method and apparatus are provided for detecting motion during a PET imaging acquisition using time of flight information.
- a method and apparatus are provided for detecting and correcting for respiratory motion and cardiac motion in PET images. Still further aspects of the present invention will be appreciated by those of ordinary skill in the art upon reading and understanding the following detailed description. Numerous additional advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of preferred embodiments.
- the invention may take form in various components and arrangements of components, and in various process operations and method steps and arrangements of process operations and method steps.
- the drawings are only for the purpose of illustrating preferred embodiments and are not to be construed as limiting the invention.
- FIGURE 1 is a schematic view of an exemplary PET imaging system
- FIGURE 2 depicts an exemplary method for detecting motion using time of flight information
- FIGURE 3 depicts an exemplary motion characterization and correction method
- FIGURE 4 depicts an exemplary electrocardiogram (ECG) signal recording
- FIGURE 5 is an exemplary respiratory waveform
- FIGURE 6 depicts another exemplary motion characterization and correction method
- FIGURE 7 depicts an exemplary method for combining the results of the methods of FIGURE 3 and FIGURE 6;
- FIGURE 8 depicts an exemplary iterative process for combining the results of the methods of FIGURE 3 and FIGURE 6;
- FIGURE 9 depicts an exemplary curve representing the volume of blood flow over time.
- An exemplary time of flight PET imaging system 100 is illustrated schematically in FIGURE 1.
- the time of flight PET imaging system 100 includes a PET imaging scanner 102.
- a patient or an imaged subject 104 is placed within a gantry 106 of the PET imaging scanner 102.
- the gantry 106 of the illustrated embodiment of the PET imaging system 100 contains several photon detectors disposed in a ring 108 around the patient 104 to detect coincident photon pairs emitted by positron - electron annihilations 110. Two such detectors A and B are shown in FIGURE 1.
- a detector ring 108 will typically have several detectors, and also there may typically be many detector rings set side by side.
- FIGURE 1 illustrates a 2- dimensional system. However, the concepts being illustrated apply equally well to a 3- dimensional system.
- the gantry may contain detectors that are not arranged in a ring geometry, such as two opposing plate detectors.
- a radiopharmaceutical is first injected into the subject 104.
- the radiopharmaceutical contains a targeting aspect which interacts with a molecule or process of interest within the patient's body, such as glucose metabolism.
- the radiopharmaceutical also contains a positron-emitting radionuclide. An emitted positron will collide with an electron from a nearby atom, and the positron and the electron annihilate. As a result of the annihilation, two different photons are emitted in substantially opposite directions.
- the photons both travel at the same speed, the speed of light indexed for the medium they are passing through.
- the ring 108 of detectors records these photons, along with PET imaging data associated with the photons, such as the time each photon is detected.
- the PET imaging scanner 102 passes the PET imaging data recorded by the ring 108 of detectors to a PET imaging, processing and display system 120.
- the PET imaging data passes to an image processor 124 which stores the data in a memory 126.
- the image processor 124 electronically processes the PET imaging data to generate images of the imaged patient or other object 104.
- the image processor 124 can show the resulting images on an associated display 128.
- a user input 130 such as a keyboard and/or mouse device may be provided for a user to control the processor 124.
- a given detector such as the detector A, including associated electronics, is able to very accurately identify the time at which it detects a photon. If two detectors such as the detectors A and B in FIGURE 1 each record receipt of a photon within a given coincidence time period, it is assumed that the pair of photons resulted from a positron - electron annihilation event such as 110. In particular, it is assumed that the annihilation 110 occurred somewhere along the straight line connecting the detectors A and B, called the line of response 112 as shown in FIGURE 1. Such pairs of detection events, or coincidence events, are recorded by the PET imaging scanner 102.
- Using image reconstruction algorithms executed by the image processor 124, the time of flight PET imaging system 100 can use such coincidence events to determine the distribution of the radiopharmaceutical in the patient, and that distribution is used to generate a PET image.
- In time-of-flight PET imaging, a coincidence event is acquired by two detectors such as A and B along with the difference in arrival time of the two coincident photons. Because the two coincident photons both travel at substantially the same speed, the arrival time difference has a direct correlation to the time of flight of the photons from the annihilation point 110 to the coincident detectors A and B. Because of that, the system 120 can approximately calculate the position along the line of response 112 where the annihilation 110 occurred, increasing the resolution of the PET image reconstruction.
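- To make the time of flight localization concrete, the following is a minimal illustrative sketch (not taken from the patent) of how an arrival-time difference can be converted into a position along the line of response 112. The detector coordinates and event values are hypothetical, and the speed of light is approximated by its vacuum value.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s (vacuum value used as an approximation)

def tof_position(det_a, det_b, dt):
    """Estimate the annihilation point along the line of response (112).

    det_a, det_b : (x, y, z) positions of the two coincident detectors, in metres.
    dt           : arrival time at det_a minus arrival time at det_b, in seconds.

    A positive dt means the photon reached det_a later, so the annihilation
    lies closer to det_b; the offset from the midpoint of the LOR is c*dt/2.
    """
    a = np.asarray(det_a, dtype=float)
    b = np.asarray(det_b, dtype=float)
    midpoint = (a + b) / 2.0
    direction = (a - b) / np.linalg.norm(a - b)  # unit vector pointing from b towards a
    offset = -C * dt / 2.0                       # later arrival at a => shift towards b
    return midpoint + offset * direction

# Example: detectors on opposite sides of the ring, 250 ps timing difference
print(tof_position((0.0, 0.4, 0.0), (0.0, -0.4, 0.0), 250e-12))
```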
- a time of flight module 132 may be employed for this and other purposes. As illustrated in FIGURE 1, the time of flight module 132 is located within the PET imaging, processing and display system 120. In various, additional embodiments the time of flight module 132 may be located within the PET imaging scanner 102 or can be located remotely from the PET imaging system 100. The time of flight module 132 may be implemented in hardware or software.
- One aspect of the present invention is directed generally to a method and apparatus for using time of flight information to detect motion, such as gross patient motion and/or respiratory motion, during an imaging acquisition.
- this aspect of the present invention is particularly useful in a PET imaging acquisition, but in its broader aspects it may be used in connection with other kinds of imaging acquisitions.
- the exemplary method and apparatus provided herein may be used to detect motion during an image acquisition without requiring the use of an external device and without analyzing reconstructed image data, although it may also be used in conjunction with such methods.
- An exemplary motion detecting method 200 according to one aspect of the present invention is illustrated in FIGURE 2.
- the exemplary method 200 utilizes time of flight information to detect gross patient motion and/or respiratory motion during an imaging procedure.
- a PET imaging system such as the system 100 is utilized to detect photon coincidence events and collect related imaging data concerning such coincidences, such as the time of flight of the coincidence.
- imaging data may be stored in a list mode 204 in memory 128.
- the list mode 204 contains all of the imaging data from an entire imaging procedure.
- the list mode 204 includes only a portion of the total imaging data corresponding to a time segment of the imaging procedure and the storage of the imaging data may be an ongoing process so that the imaging data from a portion of the imaging procedure is being analyzed and/or manipulated as additional data is being collected (i.e., a transient process).
- the approximate location of the annihilation 110 within the gantry 106 can be determined and added to the data 204. This may be done in pseudo-continuous space, where each photon coincidence event is localized to a bin along a line of response 112, wherein the bins may represent for example 5 mm intervals.
- the time of flight differences from the list mode data 204 are collected or aggregated to generate aggregate time of flight data 208.
- the time of flight data 208 can be generated for the entire imaging procedure or can be separately generated for portions of the imaging procedure in a transient process.
- the aggregated time of flight data 208 is simply a selection from the overall list mode data 204 which is large enough to provide a reliable sample of data.
- the aggregate time of flight data 208 may for example be generated by a time of flight module 132.
- the aggregate time of flight data 208 may be compiled into a histogram.
- the aggregation 208 preferably comprises a few seconds of imaging time, on the order of millions of photon coincidence events, although any time duration which provides a meaningful data sample may be employed.
- the collection 206 and aggregation 208 of the list mode data 204 may be limited to a subset of the data 204 in which motion might be expected. For example, if the imaged subject 104 is disposed within the gantry 106 with his or her back or chest lying on a table, as is typical, then respiratory motion of the subject 104 will chiefly result in vertical motion of the subject's chest within the gantry 106. Thus, the collection 206 and aggregation 208 of the list mode data 204 may be limited to or weighted towards vertical lines of response 112 within the gantry 106.
- Another typical example is where gross patient movement of the patient's arms or legs might be expected during the imaging scan.
- the collection 206 and aggregation 208 of the list mode data 204 may be limited or weighted to annihilation events occurring in extreme horizontal positions within the gantry 106.
- the collection 206 and aggregation 208 of list mode data 204 may be limited to a pair of detectors, such as detectors A and B in FIGURE 1, or a limited set of such paired detectors.
- the data 204 that is collected 206 and aggregated 208 may be limited to a particular line of response 112 or a range of line of response 112 angles, such angles being relative to the ring 108 or axial angles.
- all of the gathered data 204 is collected 206 and aggregated 208.
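- As a concrete illustration of the collection 206 and aggregation 208 just described, the sketch below builds a time of flight histogram from a short window of list mode events, optionally restricted to near-vertical lines of response. This is a hedged example only: the event record layout (time stamp, line of response angle, TOF offset) is an assumed list mode format for illustration, not the format used by the patent.

```python
import numpy as np

def aggregate_tof(events, t_start, t_stop, bin_mm=5.0, angle_range=None):
    """Aggregate TOF offsets for events acquired in [t_start, t_stop).

    events      : structured array with fields 't' (detection time, s),
                  'lor_angle' (angle of the line of response in the ring
                  plane, degrees) and 'tof_mm' (TOF offset along the line
                  of response, mm) -- an assumed layout for illustration.
    angle_range : optional (lo, hi) in degrees, e.g. (80, 100) to keep only
                  near-vertical lines of response, where respiratory chest
                  motion is expected to dominate.
    """
    sel = (events['t'] >= t_start) & (events['t'] < t_stop)
    if angle_range is not None:
        lo, hi = angle_range
        sel &= (events['lor_angle'] >= lo) & (events['lor_angle'] <= hi)
    offsets = events['tof_mm'][sel]
    edges = np.arange(-300.0, 300.0 + bin_mm, bin_mm)  # ~5 mm bins along the LOR
    hist, _ = np.histogram(offsets, bins=edges)
    return hist, edges
```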
- In step 210, the aggregate time of flight data 208 is analyzed and converted into one or more position metrics 212.
- the position metric 212 reflects the movement of the time of flight data over the course of time, and therefore the movement of the imaged subject 104 over the course of time. This position metric 212 may be displayed to the operator of the PET imaging system 100 in or near real time.
- the aggregate time of flight data 208 is converted into a respiratory wave position metric 214.
- the respiratory wave position metric 214 may, for example, be averaged or reduced to a conventional respiratory wave to help in the detection of respiratory motion.
- an initial identification may be made of the maximum and minimum positions corresponding to the respiratory motion, based on an analysis of the aggregate time of flight data 208. For example, the anterior-posterior motion of the subject within selected imaging slices along the longitudinal or "z" axis of the imaged subject 104 may be monitored for a time period, such as five or ten seconds, to generate a signal related to the amplitude of respiratory motion.
- That amplitude data may then be used to correlate the aggregate time of flight data 208 with a conventional respiratory wave to generate the respiratory wave position metric 214.
- the conventional respiratory wave may be a measured signal or a pre- generated archetype respiratory signal.
- data processing or smoothing may be applied to the respiratory wave position metric 214 for optimization purposes.
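- One simple way to derive the respiratory wave position metric 214 is to track, for each short time frame, the mean anterior-posterior (vertical) position of the detected activity and then smooth the resulting signal. The sketch below is an assumed illustration of that idea; the half-second frames and moving-average smoother are arbitrary choices, not the patent's implementation.

```python
import numpy as np

def respiratory_position_metric(times, y_positions, frame_s=0.5, smooth=5):
    """Mean vertical (anterior-posterior) event position per time frame.

    times       : event detection times, in seconds.
    y_positions : estimated vertical coordinate of each event along its line
                  of response (from TOF localization), in mm.
    Returns the frame centre times and a smoothed position signal whose
    oscillation tracks the respiratory cycle; frames with no events are NaN.
    """
    times = np.asarray(times, dtype=float)
    y_positions = np.asarray(y_positions, dtype=float)
    edges = np.arange(times.min(), times.max() + frame_s, frame_s)
    idx = np.digitize(times, edges) - 1
    signal = np.array([y_positions[idx == i].mean() if np.any(idx == i) else np.nan
                       for i in range(len(edges) - 1)])
    kernel = np.ones(smooth) / smooth              # simple moving-average smoothing
    smoothed = np.convolve(signal, kernel, mode='same')
    centres = edges[:-1] + frame_s / 2.0
    return centres, smoothed
```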
- the aggregate time of flight data 208 may be converted into a gross motion position metric 216 in various embodiments.
- the aggregate time of flight data 208 is utilized to obtain information regarding the center of activity.
- the center of annihilation activity may for example be determined for the entire imaged subject 104 or for only a portion of the imaged subject.
- the gross motion position metric 216 may reflect the 2-dimensional center of photon annihilation activity within one or more axial slices of imaging data 204.
- Various combinations are also possible.
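- The gross motion position metric 216 can likewise be illustrated with a short sketch that computes the 2-dimensional centre of annihilation activity within each axial slice. The data layout below is an assumption made for illustration only.

```python
import numpy as np

def center_of_activity_per_slice(x, y, z, slice_mm=20.0):
    """(x, y) centre of annihilation activity for each axial slice.

    x, y, z : estimated event coordinates in mm (e.g. from TOF localization).
    Movement of these per-slice centres across successive time frames can be
    compared against a threshold to flag suspected gross patient motion.
    """
    x, y, z = (np.asarray(v, dtype=float) for v in (x, y, z))
    edges = np.arange(z.min(), z.max() + slice_mm, slice_mm)
    idx = np.digitize(z, edges) - 1
    centres = []
    for i in range(len(edges) - 1):
        in_slice = idx == i
        if np.any(in_slice):
            centres.append((x[in_slice].mean(), y[in_slice].mean()))
        else:
            centres.append((np.nan, np.nan))       # empty slice
    return np.array(centres)
```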
- In step 218, the consistency of the position metric 212 is monitored, recorded and analyzed over time.
- the position metric is then used to generate a final position metric 220.
- If the position metric 212 is a respiratory wave position metric 214, it may be used to generate a final respiratory waveform 220 for use in image reconstruction.
- the final respiratory waveform 220 can be used for respiratory gating or to otherwise classify the time sequence of the list mode data 204. In this manner, respiratory gating can be accomplished using the exemplary method 200 without the need for an external device, such as a bellows transducer or video camera.
- the final respiratory waveform 220 is used to generate gating signals that are inserted into the list mode data 204 to mark the list mode data for time-based respiratory framing.
- the final respiratory waveform 220 is utilized to generate a respiratory amplitude for use in flexible, amplitude based gating. Such respiratory gating may be used, for example, in radiation therapy planning.
- the final respiratory waveform 220 may be analyzed in near real time during the imaging acquisition to determine whether the subject's respiratory motion is exceeding pre-set thresholds and, if it is, then send a warning signal to the imaging operator.
- If the position metric 212 is a gross motion position metric 216, it may be used to generate a final gross motion position metric 220 for use in image reconstruction. It can also be monitored in step 218 and compared to one or more pre-selected motion threshold(s) to alert the operator of the PET imaging system 100 that gross patient motion has occurred when a pre-selected motion threshold is exceeded. In various embodiments of the exemplary method 200, the warning that is provided to the operator of the PET imaging system 100 is accompanied with information about the region of the imaged subject 104 where gross motion is suspected to have occurred. In yet additional embodiments, information regarding the direction and/or magnitude of the gross motion is provided to the operator.
- gross motion time flags are introduced into the list mode data 204 for use in separating the image data into images prior to the gross motion and images after the gross motion, or other diagnostic purposes.
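- A threshold check of the kind described above might be sketched as follows; the 5 mm threshold and per-frame metric format are illustrative assumptions, not values from the patent.

```python
import numpy as np

def flag_gross_motion(centres_per_frame, frame_times, threshold_mm=5.0):
    """Return the times at which gross patient motion is suspected.

    centres_per_frame : array of (x, y) centres of activity, one per time frame.
    frame_times       : corresponding frame times, in seconds.
    Frames whose centre has moved more than threshold_mm from the first frame
    are flagged; these times can be inserted into the list mode data 204 as
    gross motion time flags.
    """
    centres = np.asarray(centres_per_frame, dtype=float)
    displacement = np.linalg.norm(centres - centres[0], axis=1)
    return np.asarray(frame_times)[displacement > threshold_mm]
```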
- respiratory gating has most commonly been performed using respiratory information gathered by external devices, such as a bellows transducer, video cameras, or other equipment to correlate the gated windows to the respiratory cycle.
- the gated windows of the data are then reconstructed to form an image using mathematical algorithms, which provide for spatial registration of the reconstructed images.
- One aspect of the present invention is directed generally to a method and apparatus for correcting respiratory motion and cardiac motion in PET images.
- the exemplary method and apparatus provided herein are useful for characterizing and correcting for both respiratory motion and cardiac motion without the need for an external device to gather information regarding the respiratory motion.
- the method may, however, in some embodiments be used in conjunction with such devices.
- An exemplary motion characterization and correction method 300 according to one aspect of the present invention is illustrated in FIGURE 3.
- the exemplary method 300 corrects for both respiratory motion and cardiac motion in cardiac PET images.
- additional embodiments of the method are applicable to other types of PET imaging or other imaging modalities, such as CT or SPECT, or combined imaging modalities, such as PET/CT and SPECT/CT imaging.
- a PET imaging system such as the system 100 is utilized to conduct a cardiac PET acquisition and detect photon coincidences arising from annihilation events 110 occurring in the area of the heart of an imaged subject 104.
- the information from the cardiac PET acquisition is collected in a list mode data 304 that is stored in memory 128.
- In step 306, a digitized signal of the heart muscle contractions of the imaged subject 104 is acquired concurrently with the cardiac PET acquisition 302 in the form of an electrocardiogram (ECG) 308.
- An exemplary electrocardiogram (ECG) recording is set forth in FIGURE 4.
- the horizontal axis of the ECG represents the passage of time, while the plotted curve reflects the electrical activity and therefore the phase of the subject's heart at any given point in time along that axis.
- the ECG indicates the starting and stopping times of the cardiac cycle, as the heart repeatedly expands and contracts.
- In step 310, the time-stamp of the electrocardiogram (ECG) 308 is synchronized with the acquired list mode data 304 to generate cardiac gated list mode data 312. More specifically, the acquired list mode data 304 is taken over the course of several heartbeats or cardiac cycles. The list mode data 304 contains the time at which each photon coincidence was detected. The electrocardiogram (ECG) 308, in turn, provides the phase of the patient's heart within the cardiac cycle at that particular point in time. The cardiac cycle may, as one example, be subdivided into a series of cardiac gated windows, such as 8 to 16 windows.
- each detected photon coincidence in the list mode data 304 is identified with a cardiac gated window of the cardiac cycle.
- the cardiac gated list mode data 312 for a particular gated window will have data corresponding only to the phase of the heart for that window, but taken during different repetitions of the cardiac cycle over the entire imaging acquisition time.
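- The assignment of list mode events to cardiac gated windows can be sketched as follows. This is a hedged illustration: it assumes the electrocardiogram (ECG) 308 has already been reduced to a list of R-peak times marking the start of each cardiac cycle, a common preprocessing step that the patent does not prescribe.

```python
import numpy as np

def cardiac_gate(event_times, r_peaks, n_windows=16):
    """Assign each coincidence event to a cardiac gated window and cycle.

    event_times : detection times of the photon coincidences, in seconds.
    r_peaks     : sorted ECG R-peak times, in seconds, marking cycle starts.
    Returns (window_index, cycle_index) per event; events outside any
    complete cardiac cycle are marked with -1.
    """
    event_times = np.asarray(event_times, dtype=float)
    r_peaks = np.asarray(r_peaks, dtype=float)
    cycle = np.searchsorted(r_peaks, event_times, side='right') - 1
    valid = (cycle >= 0) & (cycle < len(r_peaks) - 1)
    safe = np.clip(cycle, 0, len(r_peaks) - 2)
    phase = (event_times - r_peaks[safe]) / (r_peaks[safe + 1] - r_peaks[safe])
    window = np.full(event_times.shape, -1, dtype=int)
    window[valid] = np.minimum((phase[valid] * n_windows).astype(int), n_windows - 1)
    cycle = np.where(valid, cycle, -1)
    return window, cycle
```

- In this sketch the window index plays the role of the cardiac gated windows used in method 300, while the cycle index is the per-heartbeat label that the second method, described further below, relies on.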
- In step 314, the cardiac gated list mode data 312 in each of the cardiac gated windows is separately reconstructed to form an image of the heart. This produces a series of cardiac gated cardiac images 316, with each image corresponding to one of the gated windows of the entire cardiac cycle. So if the cardiac cycle was divided into 16 gated windows, then there will be 16 gated cardiac images 316.
- Because a typical PET acquisition usually lasts an extended period of time, on the order of minutes or tens of minutes, it will contain data spanning multiple respiratory cycles. For example, if a particular cardiac PET acquisition lasts 5 minutes, and the length of time of an average respiratory cycle is around 4 seconds (as an example), the imaged subject 104 will go through about 75 respiratory cycles during the PET acquisition.
- An exemplary respiratory wave or respiratory cycle is depicted in FIGURE 5.
- Each of the cardiac gated cardiac images 316 is blurred due to the motion of the heart caused by these respiratory cycles.
- the list mode data 304 photon coincidences are spread out in space due to the movement of the heart caused by the contraction and expansion of the subject's lungs.
- That respiratory movement can cause the imaged subject's heart to move up, down, left, right, forward, backward, or even be rotated through a torsional motion during the PET image acquisition. Accordingly, the blurring and motion artifacts introduced into the cardiac gated cardiac images 316 due to respiratory motion may advantageously be removed.
- the number of cardiac gated windows for the cardiac cycle is chosen so that, within a particular gated window, one would expect very little or no movement of the heart. As a consequence, any movement of the annihilation data within each gated window most likely results from respiratory movement, not cardiac movement. In this way, respiratory motion of the heart can be isolated from cardiac motion, and may be approximated by movement of the heart within a particular cardiac gated window.
- each one of the cardiac gated windows defining the cardiac gated images 316 is further divided into sub-intervals of time.
- the list mode data 304 is assigned to the appropriate sub-interval within the cardiac gated window, again using the electrocardiogram (ECG) 308 as a guide.
- the list mode data 304 assigned to each of those sub-intervals represents a respiratory gated cardiac image 320. Any number of time sub-intervals may be used for this purpose.
- In step 322, the center of activity is calculated for each of the respiratory gated cardiac images 320. As discussed above, any movement of that center of activity represents respiratory movement which has been isolated from cardiac movement.
- any difference(s) in the centers of activity of the respiratory gated cardiac images within the same cardiac gated window is indicative of respiratory movement, not cardiac movement. In that way any such difference(s) may be used to generate respiratory motion vectors 324, which may be used to correct the list mode data 204 for respiratory motion in image reconstruction.
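- In code, generating respiratory motion vectors 324 from the centres of activity within one cardiac gated window might look like the sketch below. It assumes each respiratory gated cardiac image 320 is represented simply by the event coordinates assigned to its time sub-interval, an illustrative simplification rather than the patent's actual image-space processing.

```python
import numpy as np

def respiratory_motion_vectors(subinterval_events):
    """Motion vectors from centres of activity within one cardiac gated window.

    subinterval_events : list of (N_i, 3) arrays of event coordinates, one per
                         time sub-interval of the gated window (i.e. one per
                         respiratory gated cardiac image 320).
    Returns the displacement of each sub-interval's centre of activity relative
    to the first sub-interval; by construction this reflects respiratory motion
    rather than cardiac motion.
    """
    centres = np.array([np.mean(ev, axis=0) for ev in subinterval_events])
    return centres - centres[0]
```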
- Turning to the second exemplary motion characterization and correction method 600 depicted in FIGURE 6, a PET imaging system such as the system 100 is again utilized to conduct a cardiac PET acquisition and detect photon coincidences arising from annihilation events 110 occurring in the area of the heart of an imaged subject 104.
- the information from the cardiac PET acquisition 302 is collected in a list mode data 304 that is stored in memory 128.
- a digitized signal of the heart muscle contractions of the imaged subject 104 is acquired concurrently with the cardiac PET acquisition 302 in the form of an electrocardiogram (ECG) 308.
- An exemplary electrocardiogram (ECG) recording is set forth in FIGURE 4.
- the horizontal axis of the ECG represents the passage of time, while the plotted curve reflects the electrical activity and therefore the phase of the subject's heart at any given point in time along that axis.
- the ECG indicates the starting and stopping times of the cardiac cycle, as the heart repeatedly expands and contracts.
- In step 610, the time-stamp of the electrocardiogram (ECG) 308 is synchronized with the acquired list mode data 304 to generate cardiac cycle data 612. More specifically, the acquired list mode data 304 is taken over the course of several heartbeats or cardiac cycles. The list mode data 304 contains the time at which each photon coincidence was detected. The electrocardiogram (ECG) 308, in turn, identifies which of the heart beats (the first cycle, the second cycle, etc.) corresponds to that particular point in time. There may have been, as one example, 300 repetitions of the cardiac cycle during a five-minute imaging acquisition time.
- each detected photon coincidence in the list mode data 304 is identified with a particular repetition of the cardiac cycle during the image acquisition period.
- the cardiac cycle data 612 for a particular repetition of the cardiac cycle contains data corresponding only to that repetition of the cardiac cycle.
- In step 614, the cardiac cycle data 612 for each of the cardiac cycles is separately reconstructed to form an image of the heart. This produces a series of cardiac cycle images 616, with each image corresponding to one entire cardiac cycle.
- In step 618, the center of annihilation activity for each of the cardiac cycle images 616 is calculated.
- cardiac motion affects the position of the center of annihilation in a negligible way, because each image 616 corresponds to an entire cardiac cycle.
- any movement of the center of photon annihilation activity in these images, from image to image, results from respiratory movement and not cardiac movement.
- respiratory motion of the heart can be isolated from cardiac motion.
- Any difference(s) in the center of annihilation activity between cardiac cycle images 616 may be used to generate respiratory motion vectors 620 for use in correcting for respiratory motion in image reconstruction.
- Each cardiac cycle image 616 contains a sufficient number of photon coincidence counts to calculate the center of activity for the particular cardiac cycle represented by the image 616.
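- A corresponding sketch for this second approach: using a per-event cardiac cycle index (for example, as produced by a gating helper like the one sketched earlier), the centre of annihilation activity is computed per complete cardiac cycle, and differences between cycles yield the respiratory motion vectors 620. The data layout is again an assumption for illustration.

```python
import numpy as np

def per_cycle_motion_vectors(coords, cycle_index):
    """Respiratory motion vectors 620 from per-cardiac-cycle centres of activity.

    coords      : (N, 3) array of event coordinates, in mm.
    cycle_index : cardiac cycle number for each event (-1 for unassigned events).
    Returns the cycle labels and the displacement of each cycle's centre of
    activity relative to the first complete cycle.
    """
    coords = np.asarray(coords, dtype=float)
    cycle_index = np.asarray(cycle_index)
    cycles = np.unique(cycle_index[cycle_index >= 0])
    centres = np.array([coords[cycle_index == c].mean(axis=0) for c in cycles])
    return cycles, centres - centres[0]
```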
- Both exemplary motion correction methods 300 and 600 estimate the same respiratory motion, occurring during the same PET acquisition, but they do so in different ways. Accordingly, respiratory motion vectors 324 obtained from the first exemplary method 300 and respiratory motion vectors 620 obtained from the second exemplary method 600 both represent estimates for the same respiratory motion.
- either exemplary motion correction method 300 or 600 may be individually used to correct for the respiratory motion.
- both respiratory motion vectors 324 and respiratory motion vectors 620 may be combined such as by averaging to generate combined respiratory motion vectors.
- In FIGURE 7, an exemplary method 700 for combining the respiratory motion estimates from exemplary method 300 and exemplary method 600 is illustrated.
- the respiratory motion vectors 324 are combined with the respiratory motion vectors 620 to generate combined respiratory motion vectors 704. It is desirable to appropriately combine respiratory motion vectors 324 with respiratory motion vectors 620 to generate the most probable respiratory estimation.
- Respiratory motion vectors 324 can be combined with respiratory motion vectors 620 in a variety of different ways in various embodiments of method 700. For example, a weighted least square method may be utilized. Various factors may be utilized in the weighted least square method. For example, the weighted least square method may be based on a relaxation factor, number of total photon coincidence counts, amplitude of respiration, regularity of respiratory motion, noise, signal to noise ratio, regularity of cycle from the electrocardiogram, or other suitable variables.
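- A hedged sketch of one possible combination is shown below. The simple per-component inverse weighting used here, with weights standing in for factors such as coincidence counts or signal to noise ratio, is only one illustrative realization of the weighted least square idea mentioned above, not the patent's prescribed formula.

```python
import numpy as np

def _as_weight(w, shape):
    """Broadcast a scalar or per-state weight to the shape of the vectors."""
    w = np.asarray(w, dtype=float)
    if w.ndim == 1:                      # one weight per respiratory state
        w = w[:, None]
    return np.broadcast_to(w, shape)

def combine_motion_vectors(v_324, v_620, w_324=1.0, w_620=1.0):
    """Combine the two respiratory motion estimates into combined vectors 704.

    v_324, v_620 : (K, 3) respiratory motion vectors from methods 300 and 600,
                   assumed to be sampled on a common set of K respiratory states.
    w_324, w_620 : non-negative weights (scalars or length-K arrays), e.g.
                   derived from photon coincidence counts or noise estimates.
    The weighted average is the closed-form per-component solution of the
    corresponding weighted least squares problem.
    """
    v_324 = np.asarray(v_324, dtype=float)
    v_620 = np.asarray(v_620, dtype=float)
    wa = _as_weight(w_324, v_324.shape)
    wb = _as_weight(w_620, v_620.shape)
    return (wa * v_324 + wb * v_620) / (wa + wb)
```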
- the combined respiratory motion vectors 704 are used to reconstruct the PET cardiac acquisition data to generate a respiratory motion free cardiac image 708.
- an iterative process may be used to combine the respiratory motion vectors 324, 620.
- In FIGURE 8, an exemplary iterative method 800 for combining the respiratory motion vectors 324, 620 is depicted.
- one of the sets of respiratory motion vectors 324, 620 obtained by method 300 or 600 is utilized by itself to correct for the respiratory motion in step 802.
- the PET acquisition data is reconstructed to obtain a first motion corrected reconstructed PET image 806.
- respiratory motion is then estimated using the motion estimation method 300 or 600 that was not previously used in step 802.
- In step 810, the motion corrected PET acquisition data is reconstructed to obtain a second motion corrected reconstructed PET image 812.
- the steps of iterative method 800 are then repeated, using either motion correction method 300 or 600 for each iteration of method 800 as necessary to correct for respiratory motion.
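- The alternation of iterative method 800 can be outlined in code as below. All four callables are placeholders with hypothetical names supplied by the caller, since the patent leaves the reconstruction and motion-estimation implementations to the imaging system; this is an outline of the control flow only.

```python
def iterative_motion_correction(list_mode, ecg, estimate_300, estimate_600,
                                apply_vectors, reconstruct, n_iterations=3):
    """Alternate between the two respiratory motion estimators, correcting the
    list mode data with each new estimate and reconstructing after each pass.

    estimate_300, estimate_600 : callables implementing methods 300 and 600.
    apply_vectors              : callable applying motion vectors to list mode data.
    reconstruct                : callable producing a PET image from list mode data.
    """
    data = list_mode
    estimators = (estimate_300, estimate_600)
    image = None
    for i in range(n_iterations):
        vectors = estimators[i % 2](data, ecg)   # steps 802 / 808: estimate motion
        data = apply_vectors(data, vectors)      # correct the list mode data
        image = reconstruct(data)                # steps 804 / 810: reconstruct
    return image
```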
- cardiac motion can then be estimated and removed.
- this method is applied to the whole left ventricle. In various additional embodiments, this method is applied independently to each of the 17 segments of the heart. In various additional embodiments, this method is applied to each pixel of the left ventricle of the heart.
- Various additional spatial decompositions of the heart could also be utilized in additional embodiments.
- Once the respiratory motion has been removed, the major portion of the motion left in the cardiac PET image is due to cardiac motion. If the heart is processed as a whole, the residual motion is the same for the whole heart. If the heart is analyzed as 17 segments, the residual motions are different from segment to segment. However, it should be understood that the granularity of the spatial decomposition of the heart depends on the count density per region.
- the respiratory motion is first estimated, then removed from the original PET cardiac data. Then, the cardiac motion is estimated and removed from the respiratory motion corrected PET cardiac data.
- the estimation and removal of respiratory motion and cardiac motion from the PET data is carried out by an iterative process by simply repeating estimation and removal of the respiratory motion and the estimation and removal of the cardiac motion. In this manner, the cross-contamination of the respiratory motion estimates and cardiac motion estimates will be minimized.
- the respiratory and cardiac motion corrections are performed in image space.
- the respiratory and/or cardiac motion corrections are introduced at the level of the lines of response 112.
- the original list mode data is modified using the respiratory and/or cardiac motion estimations and then a reconstruction is performed.
- the respiratory motion and cardiac motion are estimated for the entire cardiac region.
- the estimation of respiratory motion and cardiac motion is focused solely on the left ventricle.
- separate estimations are made independently for one or more of each of the 17 segments of the heart.
- the methodology used for estimating respiratory and cardiac motion is applied to each pixel of the left ventricle, or each pixel of another particular region of the heart.
- the previously described motion characterization and correction methods have a variety of uses and benefits. For example, the methods described herein are helpful in sharpening a gated PET image and improving the reconstruction of the gated PET images. In addition, the motion correction methods described herein may be used to improve a dynamic PET reconstruction.
- One additional potential use for the motion correction methods described herein is to determine the ejection fraction of the heart.
- An exemplary curve 900 representing the volume of blood flow over time is set forth in FIGURE 9.
- the ejection fraction is a commonly recognized measurement of the health of the heart, and it may be calculated as follows. The volume of blood within a ventricle immediately before a contraction is known as the end-diastolic volume.
- the volume of blood within the ventricle at the end of contraction is known as the end-systolic volume.
- the difference between the end-diastolic and end-systolic volumes is the stroke volume, or the volume of blood ejected from the ventricle with each beat.
- the ejection fraction is the fraction of the end-diastolic volume that is ejected with each beat; that is, it is stroke volume (SV) divided by end-diastolic volume (EDV), as follows: EF = SV / EDV = (EDV − ESV) / EDV, where ESV is the end-systolic volume.
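- As a worked example with illustrative numbers (not taken from the patent), an end-diastolic volume of 120 mL and an end-systolic volume of 50 mL give a stroke volume of 70 mL and an ejection fraction of 70 / 120 ≈ 0.58, or about 58%.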
- the motion corrections described herein can also be used to obtain improved absolute blood flow measurements or to perform other functions based on underlying PET imaging data.
- Logic includes but is not limited to hardware, firmware, software and/or combinations of each to perform a function(s) or an action(s), and/or to cause a function or action from another component.
- logic may include a software controlled microprocessor, discrete logic such as an application specific integrated circuit (ASIC), or other programmed logic device.
- Logic may also be fully embodied as software.
- Software includes but is not limited to one or more computer readable and/or executable instructions that cause a computer or other electronic device to perform functions, actions, and/or behave in a desired manner.
- the instructions may be embodied in various forms such as routines, algorithms, modules or programs including separate applications or code from dynamically linked libraries.
- Software may also be implemented in various forms such as a stand-alone program, a function call, a servlet, an applet, instructions stored in a memory, part of an operating system or other type of executable instructions. It will be appreciated by one of ordinary skill in the art that the form of software is dependent on, for example, requirements of a desired application, the environment it runs on, and/or the desires of a designer/programmer or the like.
- the systems and methods described herein can be implemented on a variety of platforms including, for example, networked control systems and stand-alone control systems. Additionally, the logic, databases or tables shown and described herein preferably reside in or on a computer readable medium. Examples of different computer readable media include Flash Memory, Read-Only Memory (ROM), Random-Access Memory (RAM), programmable read-only memory (PROM), electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disk or tape, optically readable mediums including CD-ROM and DVD-ROM, and others. Still further, the processes and logic described herein can be merged into one large process flow or divided into many sub-process flows. The order in which the process flows herein have been described is not critical and can be rearranged while still accomplishing the same results. Indeed, the process flows described herein may be rearranged, consolidated, and/or re-organized in their implementation as warranted or desired.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2012128838/14A RU2554378C2 (en) | 2009-12-10 | 2010-11-17 | Method and device for application of time-of-flight information for detection and introduction of corrections for movement in scanograms |
US13/509,656 US8824757B2 (en) | 2009-12-10 | 2010-11-17 | Method and apparatus for using time of flight information to detect and correct for motion in imaging scans |
EP10805315.8A EP2509505B1 (en) | 2009-12-10 | 2010-11-17 | Method and apparatus for using time of flight information to detect and correct for motion in imaging scans |
CN201080055480.1A CN102781331B (en) | 2009-12-10 | 2010-11-17 | Flight-time information is used to detect the method and apparatus with the motion in correcting imaging scanning |
JP2012542645A JP6243121B2 (en) | 2009-12-10 | 2010-11-17 | Method and apparatus for motion detection and correction in imaging scans using time-of-flight information |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US28520509P | 2009-12-10 | 2009-12-10 | |
US61/285,205 | 2009-12-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2011070465A2 true WO2011070465A2 (en) | 2011-06-16 |
WO2011070465A3 WO2011070465A3 (en) | 2012-01-12 |
Family
ID=44145982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2010/055248 WO2011070465A2 (en) | 2009-12-10 | 2010-11-17 | Method and apparatus for using time of flight information to detect and correct for motion in imaging scans |
Country Status (6)
Country | Link |
---|---|
US (1) | US8824757B2 (en) |
EP (1) | EP2509505B1 (en) |
JP (1) | JP6243121B2 (en) |
CN (1) | CN102781331B (en) |
RU (1) | RU2554378C2 (en) |
WO (1) | WO2011070465A2 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012176114A1 (en) * | 2011-06-21 | 2012-12-27 | Koninklijke Philips Electronics N.V. | Respiratory motion determination apparatus |
CN103126701A (en) * | 2011-11-30 | 2013-06-05 | 株式会社东芝 | Positron emission computed tomography apparatus and image processing apparatus |
JP2014098583A (en) * | 2012-11-13 | 2014-05-29 | Toshiba Corp | Nuclear medicine diagnosis device and image processing program |
CN104144649A (en) * | 2011-10-25 | 2014-11-12 | 皮拉莫尔影像股份公司 | Method for producing optimised tomography images |
WO2014194412A1 (en) * | 2013-06-07 | 2014-12-11 | Bienenstock Elazar A | Single photon emission computed tomography imaging method |
US9173625B2 (en) | 2012-04-30 | 2015-11-03 | Elazar A. Bienenstock | Single photon emission computed tomography imaging method |
US10448901B2 (en) | 2011-10-12 | 2019-10-22 | The Johns Hopkins University | Methods for evaluating regional cardiac function and dyssynchrony from a dynamic imaging modality using endocardial motion |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103282941B (en) * | 2011-01-05 | 2016-09-21 | 皇家飞利浦电子股份有限公司 | The method and apparatus of the motion utilizing gate-control signal to detect and to correct in list mode PET data |
US8913710B2 (en) | 2011-04-27 | 2014-12-16 | Varian Medical Systems, Inc. | Truncation correction imaging enhancement method and system |
US8897527B2 (en) * | 2011-06-07 | 2014-11-25 | Varian Medical Systems, Inc. | Motion-blurred imaging enhancement method and system |
US8903150B2 (en) | 2011-07-31 | 2014-12-02 | Varian Medical Systems, Inc. | Filtration imaging enhancement method and system |
CN103381095A (en) * | 2012-05-03 | 2013-11-06 | 三星电子株式会社 | Apparatus and method for generating image in positron emission tomography |
WO2014022104A1 (en) * | 2012-07-31 | 2014-02-06 | General Electric Company | Method and system for determination of geometric features in objects |
US9078622B2 (en) * | 2013-03-13 | 2015-07-14 | General Electric Company | Method and apparatus for data selection for positron emission tomogrpahy (PET) image reconstruction |
US9398855B2 (en) | 2013-05-30 | 2016-07-26 | Siemens Aktiengesellschaft | System and method for magnetic resonance imaging based respiratory motion correction for PET/MRI |
US9375184B2 (en) | 2013-09-12 | 2016-06-28 | Technische Universität München | System and method for prediction of respiratory motion from 3D thoracic images |
CN106413533B (en) * | 2014-06-06 | 2020-12-22 | 皇家飞利浦有限公司 | Apparatus, system and method for detecting apnea of a subject |
CN107133549B (en) | 2016-02-29 | 2020-11-24 | 上海联影医疗科技有限公司 | ECT motion gating signal acquisition method and ECT image reconstruction method |
CN106251380B (en) * | 2016-07-29 | 2022-07-15 | 上海联影医疗科技股份有限公司 | Image reconstruction method |
WO2018172566A1 (en) * | 2017-03-24 | 2018-09-27 | Koninklijke Philips N.V. | Noise-robust real-time extraction of the respiratory motion signal from pet list-data |
EP3612098B1 (en) * | 2017-04-21 | 2023-01-04 | Koninklijke Philips N.V. | Respiratory gating using pulse oximeters for tomographic imaging |
US10282871B2 (en) | 2017-07-10 | 2019-05-07 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for pet image reconstruction |
CN107481226B (en) * | 2017-07-27 | 2021-06-01 | 东软医疗系统股份有限公司 | Method and device for removing abnormal scanning data and PET system |
US10410383B2 (en) * | 2017-08-26 | 2019-09-10 | Uih America, Inc. | System and method for image data processing in positron emission tomography |
CN108634974B (en) * | 2018-04-03 | 2022-03-04 | 东软医疗系统股份有限公司 | Gate control signal determining method and device |
EP4270413A3 (en) * | 2018-04-05 | 2023-12-27 | Siemens Medical Solutions USA, Inc. | Motion signal derived from imaging data |
FR3094889B1 (en) * | 2019-04-12 | 2022-08-19 | Quantum Surgical | Device and method for monitoring patient breathing for a medical robot |
CN110215203B (en) * | 2019-05-28 | 2021-10-22 | 上海联影医疗科技股份有限公司 | Electrocardiosignal acquisition method and device, computer equipment and storage medium |
CN110215226B (en) * | 2019-05-28 | 2023-10-03 | 上海联影医疗科技股份有限公司 | Image attenuation correction method, image attenuation correction device, computer equipment and storage medium |
JP7330833B2 (en) * | 2019-09-20 | 2023-08-22 | 株式会社日立製作所 | Radiation imaging device and radiotherapy device |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6937696B1 (en) * | 1998-10-23 | 2005-08-30 | Varian Medical Systems Technologies, Inc. | Method and system for predictive physiological gating |
US7844317B2 (en) * | 2003-11-26 | 2010-11-30 | General Electric Company | Method and system for estimating three-dimensional respiratory motion |
US8285359B2 (en) * | 2003-11-26 | 2012-10-09 | General Electric Company | Method and system for retrospective gating using multiple inputs |
JP4467987B2 (en) * | 2004-01-05 | 2010-05-26 | 株式会社東芝 | Nuclear medicine diagnostic equipment |
JP2007107995A (en) * | 2005-10-13 | 2007-04-26 | Toshiba Corp | Nuclear medicine imaging device and image data generation method |
DE102005049862A1 (en) * | 2005-10-18 | 2007-04-26 | Siemens Ag | Movement correction method for use during imaging heart, involves combining recorded pictures with one another to generate combined image data record, where calculated variation is taken into account when combining pictures |
US8144962B2 (en) * | 2006-02-28 | 2012-03-27 | Koninklijke Philips Electronics N.V. | Local motion compensation based on list mode data |
RU2317771C2 (en) * | 2006-04-03 | 2008-02-27 | Институт физиологии природных адаптаций Уральского отделения Российской академии наук | Method for correcting vegetative misbalance states with varicard complex for processing cardiointervalograms and analyzing cardiac rhythm variability, operating under computer software program with biofeedback |
WO2008096285A2 (en) * | 2007-02-07 | 2008-08-14 | Koninklijke Philips Electronics, N.V. | Motion estimation in treatment planning |
CN101681520B (en) * | 2007-05-30 | 2013-09-25 | 皇家飞利浦电子股份有限公司 | PET local tomography |
US8107695B2 (en) * | 2007-06-27 | 2012-01-31 | General Electric Company | Methods and systems for assessing patient movement in diagnostic imaging |
DE102007034953B4 (en) * | 2007-07-26 | 2016-09-22 | Siemens Healthcare Gmbh | A method for recording movements of a patient and associated medical device |
US8351671B2 (en) | 2007-07-26 | 2013-01-08 | Koninklijke Philips Electronics N.V. | Motion correction in nuclear imaging |
JP5558672B2 (en) * | 2008-03-19 | 2014-07-23 | 株式会社東芝 | Image processing apparatus and X-ray computed tomography apparatus |
JP2009236793A (en) * | 2008-03-28 | 2009-10-15 | Hitachi Ltd | Method for creating image information, method for creating tomographic image information for tomographic photographing apparatus, and tomographic photographing apparatus |
-
2010
- 2010-11-17 RU RU2012128838/14A patent/RU2554378C2/en not_active IP Right Cessation
- 2010-11-17 JP JP2012542645A patent/JP6243121B2/en active Active
- 2010-11-17 US US13/509,656 patent/US8824757B2/en active Active
- 2010-11-17 CN CN201080055480.1A patent/CN102781331B/en active Active
- 2010-11-17 EP EP10805315.8A patent/EP2509505B1/en active Active
- 2010-11-17 WO PCT/IB2010/055248 patent/WO2011070465A2/en active Application Filing
Non-Patent Citations (1)
Title |
---|
None |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012176114A1 (en) * | 2011-06-21 | 2012-12-27 | Koninklijke Philips Electronics N.V. | Respiratory motion determination apparatus |
CN103608845A (en) * | 2011-06-21 | 2014-02-26 | 皇家飞利浦有限公司 | Respiratory motion determination apparatus |
US9414773B2 (en) | 2011-06-21 | 2016-08-16 | Koninklijke Philips N.V. | Respiratory motion determination apparatus |
US10448901B2 (en) | 2011-10-12 | 2019-10-22 | The Johns Hopkins University | Methods for evaluating regional cardiac function and dyssynchrony from a dynamic imaging modality using endocardial motion |
CN104144649A (en) * | 2011-10-25 | 2014-11-12 | 皮拉莫尔影像股份公司 | Method for producing optimised tomography images |
CN103126701A (en) * | 2011-11-30 | 2013-06-05 | 株式会社东芝 | Positron emission computed tomography apparatus and image processing apparatus |
JP2013137303A (en) * | 2011-11-30 | 2013-07-11 | Toshiba Corp | Positron emission computed tomography apparatus and image processing apparatus |
US9173625B2 (en) | 2012-04-30 | 2015-11-03 | Elazar A. Bienenstock | Single photon emission computed tomography imaging method |
US9517037B2 (en) | 2012-04-30 | 2016-12-13 | Elazar A. Bienenstock | Single photon emission computed tomography imaging method |
JP2014098583A (en) * | 2012-11-13 | 2014-05-29 | Toshiba Corp | Nuclear medicine diagnosis device and image processing program |
WO2014194412A1 (en) * | 2013-06-07 | 2014-12-11 | Bienenstock Elazar A | Single photon emission computed tomography imaging method |
US10463335B2 (en) | 2013-06-07 | 2019-11-05 | Elazar A. Bienenstock | Single photon emission computed tomography imaging method |
Also Published As
Publication number | Publication date |
---|---|
US20120275657A1 (en) | 2012-11-01 |
CN102781331B (en) | 2015-08-05 |
CN102781331A (en) | 2012-11-14 |
WO2011070465A3 (en) | 2012-01-12 |
JP6243121B2 (en) | 2017-12-06 |
RU2554378C2 (en) | 2015-06-27 |
JP2013513792A (en) | 2013-04-22 |
EP2509505B1 (en) | 2019-07-31 |
EP2509505A2 (en) | 2012-10-17 |
RU2012128838A (en) | 2014-01-20 |
US8824757B2 (en) | 2014-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8824757B2 (en) | Method and apparatus for using time of flight information to detect and correct for motion in imaging scans | |
CN1846618B (en) | Method for mathematical compensation of a periodic movement of an organ and image shooting system | |
US9414773B2 (en) | Respiratory motion determination apparatus | |
US20190133542A1 (en) | Systems and methods for data-driven respiratory gating in positron emission tomography | |
JP2007167656A (en) | Method for analysis of motion of subject, and tomographic device | |
US20190274569A1 (en) | Method for determining diastasis timing using an mri septal scout | |
JP2013513792A5 (en) | ||
US20150133803A1 (en) | Noninvasive electrocardiographic method for estimating mammalian cardiac chamber size and mechanical function | |
US10736594B2 (en) | Data-based scan gating | |
US11471065B2 (en) | Medical image diagnosis apparatus | |
US11269036B2 (en) | System and method for phase unwrapping for automatic cine DENSE strain analysis using phase predictions and region growing | |
US9066707B2 (en) | Heart location and verification in emission images | |
CN110736948A (en) | System and method for generating ECG reference data for MR imaging triggering | |
EP1644900B1 (en) | Non-invasive quantitative myocardial perfusion assessment | |
US10548494B2 (en) | Method for determining a personalized cardiac model using a magnetic resonance imaging sequence | |
US7548777B2 (en) | Computerized method for predicting the diastolic rest period in a cardiac cycle | |
CN115844435A (en) | Perfusion imaging | |
US10098605B2 (en) | Synchronous physiological measurements for cardiac acquisitions | |
US8867810B2 (en) | Automatic identification of disruptive events in imaging scans | |
Herraiz et al. | Automatic cardiac self-gating of small-animal PET data | |
CN114867414A (en) | Apparatus, system, method and computer program for providing a nuclear image of a region of interest of a patient | |
EP3939494A1 (en) | Synchronisation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080055480.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010805315 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13509656 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 4403/CHENP/2012 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012542645 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012128838 Country of ref document: RU |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10805315 Country of ref document: EP Kind code of ref document: A2 |