WO2014155232A1 - Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy - Google Patents
- Publication number
- WO2014155232A1 (PCT/IB2014/059884)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- therapy
- treatment plan
- target
- tracking data
- planning image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/103—Treatment planning systems
- A61N5/1037—Treatment planning systems taking into account the movement of the target, e.g. 4D-image based planning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/103—Treatment planning systems
- A61N5/1039—Treatment planning systems using functional images, e.g. PET or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1064—Monitoring, verifying, controlling systems and methods for adjusting radiation treatment in response to monitoring
- A61N5/1068—Gating the beam as a function of a physiological signal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1064—Monitoring, verifying, controlling systems and methods for adjusting radiation treatment in response to monitoring
- A61N5/1069—Target adjustment, e.g. moving the patient support
- A61N5/107—Target adjustment, e.g. moving the patient support in real time, i.e. during treatment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5258—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
- A61B6/5264—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
Definitions
- the present application relates generally to external beam radiation therapy (EBRT). It finds particular application in conjunction with surface tracking-based motion management and dynamic planning, and will be described with particular reference thereto. However, it is to be understood that it also finds application in other usage scenarios and is not necessarily limited to the aforementioned application.
- EBRT is delivered over multiple fractions spread over many weeks, but is usually designed based on a single static treatment plan. Failure to take into account intrafraction motion (e.g., respiratory motion, cardiac motion, etc.) and interfraction motion (e.g., tumor shrinkage due to progressive radiation damage) can result in incomplete dose coverage of targets (e.g., tumors) and damage to surrounding normal tissue, which can include organs at risk (OARs).
- To account for target motion, margins are usually added around the target. These margins are typically static and large enough to cover the entire range of motion of the target. However, margins can increase damage to surrounding normal tissue, especially during certain phases of the respiratory cycle. Tumors can also shrink over the course of the fractions, exacerbating the damage to surrounding normal tissue and bringing it into the "hot spots" of the treatment beams if the treatment beams are not appropriately adjusted.
- Planning target volume (PTV) margins are dynamically adjusted by changing the PTV contours based on gathered real-time motion data and target motion prediction models. For example, the path of the target is followed by changing the positions of multi-leaf collimator (MLC) leaves on a linear particle accelerator (LINAC).
- Delivery gating utilizes gathered real-time motion data of the target captured during treatment delivery to determine whether treatment beams need to be turned on or off.
- Approaches to quantifying real-time motion include wireless electromagnetic (EM) transponders, on-board imaging (e.g., magnetic resonance imaging (MRI) and ultrasound (US)), body surface tracking (e.g., using high speed cameras), etc.
- To make use of surface motion data, however, the correspondence between body surface motion and internal target motion needs to be known accurately.
- Body surface tracking has previously been proposed in combination with anatomical data from a pre-procedural four-dimensional (4D) computed tomography (CT) image to predict target motion patterns.
- the correspondence between external body shape and internal target shape is defined for each phase of the respiratory cycle. Therefore, given a particular body surface shape detected at any time during treatment delivery, the internal target shape and position can be predicted.
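For illustration only, the phase-indexed correspondence described above can be sketched as a simple lookup: an observed surface shape is matched to the closest reference surface among the stored respiratory phases, and the target position associated with that phase is returned. The function and data-structure names below are hypothetical, and the sketch assumes the surface points are already in one-to-one correspondence across phases; it is a minimal sketch, not the patent's implementation.

```python
import numpy as np

def predict_target_position(detected_surface, phase_surfaces, phase_targets):
    """Predict the internal target position from a detected body surface.

    detected_surface : (N, 3) surface points observed during treatment delivery
    phase_surfaces   : dict phase -> (N, 3) reference surface for that respiratory phase
    phase_targets    : dict phase -> (3,) target centroid for that phase
    """
    best_phase, best_err = None, np.inf
    for phase, ref_surface in phase_surfaces.items():
        # mean point-to-point distance between observed and reference surface
        err = np.mean(np.linalg.norm(detected_surface - ref_surface, axis=1))
        if err < best_err:
            best_phase, best_err = phase, err
    return phase_targets[best_phase], best_phase
```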
- the accuracy of the prediction relies upon repeatability in the manifestation of breathing on internal organ motion.
- the prediction can be used to implement delivery gating schemes or dynamic PTV margins, whereby surrounding normal tissue and OARs are at less risk of radiation exposure.
- Challenges with the foregoing approach to predicting target motion patterns include its inability to quantify patient-specific tumor shape and size changes that occur during the course of radiation.
- Tumor shrinkage is a common occurrence during radiation therapy, especially during the middle-to-latter fractions.
- radiation therapy is often used to shrink tumors prior to surgical removal.
- Changes in the respiratory pattern can also vary the correlation between external body shape and internal target shape over time. Registration errors will then accumulate during treatment, resulting in incorrect prediction of target motion for treatment delivery. Therefore, designing delivery gating schemes and dynamic treatment plans based on the original size and shape of the tumor is not only inefficient but also leads to harmful dosing of surrounding normal tissue.
- the present application provides a new and improved system and method which overcome the above-referenced problems and others.
- a therapy system for treating an internal target of a patient includes at least one processor programmed to receive a treatment plan to treat the internal target.
- the treatment plan includes a plurality of treatment fractions including correspondences between the internal target and an external body surface based on a pre-procedural planning image and pre-procedural tracking data.
- the at least one processor is further programmed to, before selected treatment fractions of the plurality of treatment fractions, receive a pre-fraction planning image of the target, receive tracking data of the external body surface of the patient, and update the correspondences between the internal target and the external body surface based on the received pre-fraction planning image and the received tracking data. The at least one processor is also programmed to provide the treatment plan and the updated correspondences to a therapy delivery system configured to deliver therapy to the patient in accordance with the treatment plan and using the updated correspondences.
- a therapy method for treating an internal target of a patient is provided.
- a treatment plan to treat the internal target is received.
- the treatment plan includes a plurality of treatment fractions including correspondences between the internal target and an external body surface based on a pre-procedural planning image and pre-procedural tracking data.
- a pre-fraction planning image of the target is received, tracking data of the external body surface of the patient is received, and the correspondences between the internal target and the external body surface are updated based on the received pre-fraction planning image and the received tracking data.
- the treatment plan and the updated correspondences are provided to a therapy delivery system configured to deliver therapy to the patient in accordance with the treatment plan and using the updated correspondences.
- a therapy system for treating an internal target of a patient is provided.
- the therapy system includes a planning system configured to generate a treatment plan to treat the internal target.
- the treatment plan includes a plurality of treatment fractions.
- the therapy system further includes a synchronization module configured to, before each of one or more treatment fractions selected from the plurality of treatment fractions, update a size and shape of the internal target and update the treatment plan for the updated target size and shape.
- the planning system is configured to provide the updated treatment plan to a delivery control system configured to deliver therapy to the patient in accordance with the updated treatment plan.
- One advantage resides in taking into account tumor shrinkage when predicting target motion patterns.
- Another advantage resides in reduced dose to normal tissue, which can include organs at risk (OARs) surrounding targets.
- Another advantage resides in more complete dosing of targets.
- Another advantage resides in improved tracking of target and OAR motion.
- Another advantage resides in reduced planning target volume (PTV) margins.
- Another advantage resides in improved delivery gating.
- the invention may take form in various components and arrangements of components, and in various steps and arrangements of steps.
- the drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
- FIGURE 1 illustrates a therapy system using body surface tracking for improved therapy delivery.
- FIGURE 2 illustrates an example of the workflow of the therapy system of FIGURE 1.
- FIGURE 3 illustrates another, more detailed example of the workflow of the therapy system of FIGURE 1.
- a therapy system 10 (e.g., an external beam radiation therapy (EBRT) system) includes a pre-procedural imaging system 12, which generates a pre-procedural planning image 14 of a target of a patient 16, and normal tissue of the patient 16 surrounding the target.
- the surrounding normal tissue commonly includes one or more organs at risk (OARs), and the target is commonly a lesion, such as a tumor, to be treated.
- the pre-procedural planning image 14 can include the external body surface of the patient 16 adjacent the target. For example, where the target is a tumor in a lung of the patient 16, the external body surface is the external chest surface of the patient 16.
- the pre-procedural planning image 14 is typically four-dimensional (4D), whereby it includes a three-dimensional (3D) image for each of a plurality of time points; this is in contrast with a 3D image, which includes only a single time point.
- the plurality of time points can, for example, correspond to the phases of a motion cycle of the patient 16 (e.g., the respiratory cycle of the patient) and suitably span multiple motion cycles.
- the pre-procedural planning image 14 can be generated using any imaging modality but is suitably generated using one of computed tomography (CT), positron emission tomography (PET), magnetic resonance (MR), single photon emission computed tomography (SPECT), ultrasound (US), cone-beam computed tomography (CBCT), and the like.
- Pre-procedural tracking data 20, generated by a pre-procedural tracking system 18, includes a sample measurement of the external body surface (e.g., size, shape, contours, etc.) of the patient 16 at the time points of the pre-procedural planning image 14.
- the pre-procedural tracking system 18 uses one or more high speed cameras to generate the pre-procedural tracking data 20.
- other approaches to generating the pre-procedural tracking data 20 are also contemplated.
- a planning system 22 of the therapy system 10 receives the pre-procedural planning image 14 (e.g., a 4D CT image) from the pre-procedural imaging system 12 and the pre-procedural tracking data 20 (e.g., a 4D image of the external body surface of the patient 16) from the pre-procedural tracking system 18.
- the pre-procedural planning image 14 and the pre-procedural tracking data 20 are applied to a plurality of modules of the planning system 22, including a segmentation module 24, a user interface module 26, a synchronization module 28, a registration module 30, a body surface tracking module 32, and an optimization module 34, to generate a treatment plan 36, and correspondences 38 between the external body surface and the target, for the patient 16.
- the segmentation module 24 receives an image (e.g., the pre-procedural planning image 14) and delineates one or more regions of interest (ROIs) in the image, such as a target and/or OARs.
- the ROIs are typically delineated with contours tracing the boundaries of the ROIs in the image. Delineation can be performed automatically and/or manually. As to automatic delineation, any number of known segmentation algorithms can be employed.
- the segmentation module 24 cooperates with the user interface module 26 to allow clinicians to manually delineate the regions in the image and/or manually adjust automatic delineations of the regions in the image.
- the user interface module 26 presents a user interface to an associated user with a user output device 40 (e.g., a display) of the planning system 22.
- the user interface can allow the user to at least one of delineate ROIs in an image, modify delineations of ROIs in an image, and view delineations of ROIs in an image.
- the user delineates a ROI of an image by drawing a contour along the boundary of the ROI on the image using a user input device 42 (e.g., a computer mouse) of the planning system 22.
- a delineation of a ROI in an image is typically viewed by overlaying a representative contour on the image and modified by resizing and/or reshaping the representative contour using the user input device 42.
- the user interface can also allow the user to enter and/or define parameters for generating and/or updating a treatment plan using the user input device 42.
- the synchronization module 28 receives a first anatomical data set of a patient (e.g., the pre-procedural planning image 14) and a second anatomical data set of the patient (e.g., the pre-procedural tracking data 20), each data set including one or more time points. Based on the first and second data sets, the synchronization module aligns the time points of the first data set with the time points of the second data set based on phase in a motion cycle (e.g., respiratory cycle) of the patient.
- the synchronization can be performed by direct analysis of the first and second data sets.
- the motion cycle can be extracted for each of the first and second data sets from analysis (e.g., image analysis) of features of the first and second data sets.
- the extracted motion cycles can then be aligned and used to align the time points of the first and second data sets.
- the first and second data sets can be matched (e.g., using image matching) to determine a best alignment of the time points of the first and second data sets.
- the synchronization can also be performed by analysis of metadata of the first and second data sets.
- each time point in the first and second data sets can be annotated with cardiac or respiratory phase, whereby cardiac or respiratory phase in the metadata is used for alignment of the time points of the first and second data sets.
- each time point in the first and second data sets can be annotated with a time stamp, whereby time in the metadata is used to perform alignment of the time points of the first and second data sets.
- the synchronization can also be performed by a combination of the foregoing approaches or any other approach to synchronizing.
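As one hedged illustration of the metadata-based synchronization just described, the sketch below pairs the time points of two data sets by nearest respiratory phase, assuming each time point is annotated with a phase value in [0, 1). The function name and data layout are hypothetical, not taken from the patent.

```python
def align_time_points(phases_a, phases_b):
    """Pair time points of data set A with data set B by nearest respiratory phase.

    phases_a, phases_b : sequences of per-time-point phase values in [0, 1)
    Returns a list of (index_a, index_b) pairs, one per time point of data set A.
    """
    pairs = []
    for i, pa in enumerate(phases_a):
        # circular distance, since phase wraps around from 1.0 back to 0.0
        dists = [min(abs(pa - pb), 1.0 - abs(pa - pb)) for pb in phases_b]
        pairs.append((i, dists.index(min(dists))))
    return pairs
```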
- the registration module 30 translates a first anatomical data set within a first coordinate frame (e.g., the pre-procedural tracking data 20) to a second coordinate frame of a second anatomical data set (e.g., the pre-procedural planning image 14).
- Any number of well-known registration algorithms can be employed with anatomical features or artificial features shared by the first and second data sets.
- the first data set can be deformably registered to the second data set using a deformable image registration (DIR) algorithm with anatomical features shared by the first and second data sets.
- the first data set can be registered to the second data set using fiducial markers implanted in the patient 16 and common to both the first and second data sets.
- Fiducial markers are ideal where anatomical features are difficult to find in both data sets due to, for example, changes in anatomy because of EBRT or disease progression.
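The patent does not specify a particular algorithm for the marker-based case; one standard choice, shown here only as a minimal sketch, is a rigid (Kabsch) fit of paired fiducial positions. The function name is hypothetical and the sketch assumes the markers are already matched one-to-one between the two data sets.

```python
import numpy as np

def rigid_register_fiducials(src, dst):
    """Estimate rotation R and translation t mapping src fiducials onto dst.

    src, dst : (K, 3) arrays of corresponding fiducial positions in the two data sets.
    Returns (R, t) such that dst is approximately src @ R.T + t.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - src.mean(axis=0) @ R.T
    return R, t
```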
- the first data set can be registered to the second data set using segmented structures in the first and second data sets.
- the registration includes synchronizing the first and second data sets. In some instances, this includes synchronizing the first and second data sets using the synchronization module 28.
- a 4D registration algorithm can then be applied to register the first synchronized data set with the second synchronized data set, or a 3D registration algorithm can then be applied, for each pair of aligned time points, to register data in the first synchronized data set corresponding to the first time point of the pair to data in the second synchronized data set corresponding to the second time point of the pair.
- a 4D registration algorithm can be applied to register the first data set with the second data set without first synchronizing the first and second data sets using the synchronization module 28, thereby synchronizing the first and second data sets.
- the 4D registration algorithm can find the best spatial matches between time points of the first and second data sets.
- the body surface tracking module 32 receives an anatomical image (e.g., the pre-procedural planning image 14) including a target, normal tissue surrounding the target and, typically, an external body surface adjacent the target. Further, the body surface tracking module 32 receives a delineation of the target in the anatomical image. Alternatively, the delineation of the target in the anatomical image is determined using the segmentation module 24. Even more, the body surface module 32 receives tracking data (e.g., the pre-procedural tracking data 20) describing the external body surface.
- the tracking data can be: 1) tracking data describing the external body surface during the generation of the anatomical image; 2) tracking data describing the external body surface during the generation of another anatomical image sharing common ROIs (e.g., the target) with the anatomical image and synchronized with the other anatomical image; or 3) other tracking data describing the external body surface.
- the anatomical image is typically a collection of 3D images (i.e., a 4D image) collected over a period of time, such as a plurality of respiratory cycles.
- the tracking data is a collection of sample measurements collected over the period of time, or over another period of time similar in length to it. For example, where the anatomical image is collected over a plurality of respiratory cycles, the tracking data is collected over a plurality of respiratory cycles, which need not be the same as the plurality of respiratory cycles of the anatomical image.
- the body surface tracking module 32 synchronizes and registers the anatomical image with the tracking data using the registration module 30.
- this is performed by registering the tracking data to the anatomical image, or vice versa, to bring the anatomical image and the tracking data into a common coordinate frame.
- this can alternatively be performed by registering both the anatomical image and the tracking data to another anatomical image or other tracking data using the registration module 30.
- the registration module 30 can be used to synchronize the anatomical image and the tracking data using, for example, time stamps.
- the registration module 30 can synchronize the anatomical image with the tracking data by registering or synchronizing the anatomical image to the other anatomical image.
- the registration module 30 can also synchronize the anatomical image with the tracking data using any other suitable approach.
- correspondences between the external body surface in the tracking data and the target delineation in the anatomical image are determined by the body surface tracking module 32. For each aligned pair of time points of the anatomical image and the tracking data, correspondences between the external body surface and the target delineation are determined to create, for example, a distance vector, each element of the distance vector corresponding to the distance between a pair of points, one for the delineated target and one for the external body surface. In some instances the correspondences (e.g., the distance vectors) are all combined. Alternatively, the correspondences (e.g., the distance vectors) can be grouped based on motion phase (e.g., respiratory phase) and combined within the groups. Correspondences can be combined by, for example, averaging.
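A minimal sketch of the distance-vector construction described above is shown below; it assumes the surface and target points at each aligned time point have already been matched one-to-one, and all names are hypothetical rather than the patent's.

```python
import numpy as np
from collections import defaultdict

def build_correspondences(aligned_pairs, surface_points, target_points, phases):
    """Build averaged surface-to-target distance vectors, grouped by motion phase.

    aligned_pairs  : list of (tracking_time_index, image_time_index) pairs
    surface_points : dict tracking time index -> (N, 3) tracked surface points
    target_points  : dict image time index -> (N, 3) matched points on the target delineation
    phases         : dict tracking time index -> phase label (e.g., a respiratory bin)
    Returns a dict phase -> (N, 3) mean distance vectors (target minus surface).
    """
    grouped = defaultdict(list)
    for s_idx, i_idx in aligned_pairs:
        grouped[phases[s_idx]].append(target_points[i_idx] - surface_points[s_idx])
    return {phase: np.mean(vectors, axis=0) for phase, vectors in grouped.items()}
```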
- the body surface tracking module 32 can extract a motion pattern (i.e., one or more motion cycles) from the tracking data when the tracking data is collected over a period of time.
- For example, a respiratory pattern can be extracted when the tracking data describes the external chest surface of a patient and the tracking data is collected over a period of time.
- the tracking data is first registered and synchronized before extracting the motion pattern.
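One possible way to extract such a motion pattern from surface tracking data, sketched here under the assumption that a single 1-D surface displacement signal (e.g., mean chest height) is available, is to detect end-inhale peaks and assign a cyclic phase between them. The function name and parameters are illustrative only, not the patent's method.

```python
import numpy as np
from scipy.signal import find_peaks

def extract_respiratory_phase(surface_signal, fs):
    """Estimate a cyclic respiratory phase in [0, 1) from a 1-D surface motion signal.

    surface_signal : samples of chest-surface displacement (e.g., mean anterior height)
    fs             : sampling rate in Hz
    """
    signal = np.asarray(surface_signal, dtype=float) - np.mean(surface_signal)
    # require peaks at least 0.5 s apart, i.e., at most ~2 breaths per second
    peaks, _ = find_peaks(signal, distance=max(1, int(0.5 * fs)))
    phase = np.zeros_like(signal)
    for start, end in zip(peaks[:-1], peaks[1:]):
        # linear phase ramp between consecutive end-inhale peaks
        phase[start:end] = np.linspace(0.0, 1.0, end - start, endpoint=False)
    return phase, peaks
```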
- the optimization module 34 receives input data including: 1) plan parameters for generating a treatment plan for a patient from, for example, the user interface module 26; 2) delineations of a target and, in some instances, surrounding OARs in a planning image of the patient from, for example, the segmentation module 24; 3) typically a motion pattern of the patient (e.g., a respiratory pattern describing one or more respiratory cycles when the target resides in the chest of the patient); and 4) other relevant inputs, such as a delivered dose distribution indicating dose delivered to the target.
- the delineations of the target and/or the OARs define motion patterns for the target and/or the OARs.
- Based on the received input data, the optimization module 34 generates and/or updates a treatment plan to comply with the plan parameters.
- the treatment plan can include a planning target volume (PTV), including margins around the target, and a plurality of fractions, each fraction specifying beam directions and beam energies.
- the treatment plan can include a plurality of PTVs, each including different margins around the target, and a plurality of fractions, each fraction specifying beam directions and energies for irradiating the PTV.
- the plurality of PTVs suitably correspond to different motion phases (e.g., respiratory phases when the target resides in the chest of the patient).
- In some instances, the PTV location is fixed; in other instances, the PTV location is dynamic based on the location of the target and/or the motion cycle.
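The per-phase PTVs described above could, for example, be formed by expanding the delineated target at each phase by a margin; the sketch below uses a simple isotropic voxel dilation as an approximation, with hypothetical names and no claim that this is the optimization module's actual method.

```python
from scipy.ndimage import binary_dilation

def make_phase_ptvs(target_masks, margin_mm, voxel_mm):
    """Expand per-phase binary target masks by an isotropic margin to form per-phase PTVs.

    target_masks : dict phase -> 3-D boolean array delineating the target at that phase
    margin_mm    : PTV margin in millimetres
    voxel_mm     : isotropic voxel size in millimetres
    """
    iterations = max(1, int(round(margin_mm / voxel_mm)))
    # repeated dilation with the default structuring element approximates an isotropic margin
    return {phase: binary_dilation(mask, iterations=iterations)
            for phase, mask in target_masks.items()}
```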
- the pre-procedural planning image 14 is segmented to delineate the target and, where applicable, the OARs in the pre-procedural planning image 14 using the segmentation module 24.
- the optimization module 34 is then provided: 1) the delineations of the target and, where applicable, the OARs; 2) plan parameters (e.g., from the user interface module 26); and 3) typically a motion pattern of the patient 16 (e.g., the respiratory pattern of the patient).
- the motion pattern can be determined from the pre-procedural tracking data 20 using, for example, the body surface tracking module 32. Based on these inputs, the optimization module 34 generates the treatment plan 36.
- correspondences between the external body surface and the target are determined using the body surface tracking module 32. Registration between the pre-procedural planning image 14 and the pre-procedural tracking data 20 can be readily performed using anatomical features of the external body surface when the pre-procedural planning image 14 includes the external body surface of the pre-procedural tracking data 20.
- the patient 16 is set-up on a treatment couch 44 of a therapy delivery apparatus 46 of the therapy system 10.
- the treatment plan 36 and the correspondences 38 are updated before carrying out the treatment fraction while the patient 16 is on the treatment couch 44.
- the updating can be performed for each treatment fraction.
- the updating can be performed every predetermined number of treatment fractions, such as every other treatment fraction, or every predetermined period of time, such as once every week. In some instances, the updating is always performed for the first treatment fraction.
- To perform the updating for a treatment fraction, a pre-fraction imaging system 48 generates a pre-fraction planning image 50 of the target of the patient 16, and the normal tissue of the patient 16 surrounding the target.
- the pre-fraction planning image 50 can further include the external body surface of the patient 16 included in the pre-procedural planning image 14.
- the dimensions of the pre-fraction planning image 50 are equal to the dimensions of the pre-procedural planning image 14. For example, if the pre-procedural planning image 14 is 4D, the pre-fraction planning image 50 is 4D.
- the time points of the pre-fraction planning image 50 are suitably captured to align with the time points of the pre-procedural image 14.
- For example, where the time points of the pre-procedural image 14 correspond to the phases of the respiratory cycle of the patient 16, the time points of the pre-fraction image 50 also correspond to the phases of the respiratory cycle of the patient 16.
- the pre-fraction planning image 50 can be generated using any imaging modality but is suitably generated using one of CT, PET, US, MR, SPECT, CBCT, and the like. Due to its small size and ready deployment in conjunction with a patient positioned in a linear particle accelerator (LINAC) or other EBRT system, US is particularly advantageous.
- pre-fraction tracking data 52 is generated by a delivery tracking system 54 of the therapy system 10 during the generation of the pre-fraction planning image 50.
- the pre-fraction tracking data 52 includes a sample measurement of the external body surface at time points of the pre-fraction planning image 50.
- the delivery tracking system 54 typically measures the external body surface of the patient 16 using one or more high speed cameras.
- the planning system 22 receives the pre-fraction planning image 50 (e.g., a 4D US image) from the pre-fraction imaging system 48 and, in some instances, the pre-fraction tracking data 52 (e.g., a 4D image of the body surface of the patient 16) from the delivery tracking system 54.
- the target and, where applicable, the OARs are delineated in the pre-fraction planning image 50 using the segmentation module 24.
- the delineations in the pre-fraction planning image 50 can be based off the delineations in the pre-procedural planning image 14.
- the pre-fraction planning image 50 is registered to the pre-procedural planning image 14, or vice versa, using the registration module 30.
- This allows the delineations in the pre-procedural planning image 14 to be translated into the coordinate frame of the pre-fraction planning image 50, or vice versa.
- the delineations in the pre-procedural planning image 14 are translated into the coordinate frame of the pre-fraction planning image of the first treatment fraction of the treatment plan 36.
- pre-fraction planning images of subsequent treatment fractions of the treatment plan 36 are translated to the coordinate frame of the pre-fraction planning image of the first treatment fraction to use the translated delineations.
- After segmenting the pre-fraction planning image 50, the optimization module 34 is provided with: 1) delineations of the target and, where applicable, the OARs in the pre-fraction planning image 50; 2) plan parameters; and 3) typically a motion pattern of the patient 16. The motion pattern can be based off the pre-fraction tracking data 52 or some other external data. Based on these inputs, the optimization module 34 updates the treatment plan 36 by re-optimizing one or more of the treatment beam directions, the one or more PTVs, and other parameters of the treatment plan 36.
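For a rigid registration result such as the fiducial-based fit sketched earlier, translating delineations between the pre-procedural and pre-fraction coordinate frames amounts to applying the estimated transform to the contour points; a deformable registration would apply a dense displacement field instead. This is an illustrative sketch with hypothetical names, not the patent's implementation.

```python
import numpy as np

def map_contour(contour_points, R, t):
    """Map delineation points from the pre-procedural image frame into the pre-fraction frame.

    contour_points : (N, 3) contour points in the source coordinate frame
    R, t           : rotation and translation estimated by the registration step
    """
    return np.asarray(contour_points) @ R.T + t
```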
- the correspondences between the external body surface and the target are updated using the body surface tracking module 32.
- the updating uses the pre-fraction planning image 50, and typically the pre-fraction tracking data 52 or the pre-procedural tracking data 20. It includes registering and synchronizing the pre-fraction planning image 50 with the pre-procedural tracking data 20. Where the pre-procedural tracking data 20 is used, the registration and synchronization of the pre-fraction planning image 50 with the pre-procedural tracking data 20 can use the known synchronization between the pre-procedural planning image 14 and the pre-procedural tracking data 20.
- the pre-fraction planning image 50 is registered to the pre-procedural planning image 14, or vice versa, using the registration module 30.
- This synchronizes the pre-fraction planning image 50 and the pre-procedural planning image 14, whereby the synchronization between the pre-procedural tracking data 20 and the pre-fraction planning image 50 can also be determined.
- the segmentations in the pre-fraction and pre-procedural images 14, 50 can be used to perform registration.
- synchronization between the pre-procedural tracking data 20 and the pre-fraction planning image of the first treatment fraction is determined as described above.
- pre-fraction planning images of subsequent treatment fractions of the treatment plan 36 are synchronized with the pre-fraction planning image of the first treatment fraction, whereby the synchronization between the pre-procedural tracking data 20 and the pre-fraction tracking data of the subsequent treatment fractions can also be determined.
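The chaining of known synchronizations described above (pre-procedural tracking data to pre-procedural image, then pre-procedural image to pre-fraction image) can be sketched as a simple composition of time-point alignments; the function name and data layout are hypothetical.

```python
def compose_alignment(pairs_ab, pairs_bc):
    """Chain two time-point alignments: A<->B and B<->C yield A<->C.

    pairs_ab : list of (index_a, index_b) pairs, e.g., tracking data to pre-procedural image
    pairs_bc : list of (index_b, index_c) pairs, e.g., pre-procedural image to pre-fraction image
    """
    b_to_c = {b: c for b, c in pairs_bc}
    return [(a, b_to_c[b]) for a, b in pairs_ab if b in b_to_c]
```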
- The therapy delivery apparatus 46, such as a LINAC, delivers therapy, such as ablation therapy, to the patient 16.
- the therapy typically includes radiation, such as one or more of x-rays, protons, high-intensity focused ultrasound (HIFU), and the like.
- the therapy delivery apparatus 46 is controlled by a delivery control system 56 of the therapy system 10 in accordance with the treatment plan 36 and the correspondences 38.
- the delivery control system 56 receives real-time tracking data 58 from the delivery tracking system 54.
- the real-time tracking data 58 includes the external body surface of the pre-fraction planning image 50 and the pre-procedural planning image 14.
- the delivery control system 56 determines the current motion phase using the real-time tracking data 58 or some other external data.
- the parameters of the treatment plan (e.g., the PTV) are then dynamically adjusted based on the current motion phase.
- the delivery control system 56 determines the location of the target based on the real-time tracking data 58, suitably using a body surface tracking module 60.
- Alignment of the coordinate frames of the therapy delivery apparatus 46, the treatment plan 36, the correspondences 38 and the real-time tracking data can be based on the known relationship between the coordinate frames of the pre-fraction imaging system 48 and the therapy delivery apparatus 46 (e.g., determined by a calibration procedure).
- the coordinate frame of the real-time tracking data 58 can be translated to the coordinate frame of the therapy delivery apparatus 46 by, for example, registering the real-time tracking data 58 to the pre-fraction planning image 50 and using the known relationship.
- Translation between the coordinate frame shared by the treatment plan 36 and the correspondences 38 and the coordinate frame of the therapy delivery apparatus 46 can also be performed by registering the planning image used to generate or update the treatment plan 36 with the pre-fraction planning image 50, or some other image generated by the pre-fraction imaging system 48, and using the known relationship.
- the location of the target can be determined by translating the real-time tracking data 58 to the coordinate frame of the therapy delivery apparatus 46, or vice versa, and applying the correspondences 38 to the real-time tracking data 58.
- Where the correspondences 38 are distance vectors, the distances can be added to or subtracted from the external body surface, as necessary.
- the correspondences 38 correspond to specific motion phases. Hence, the appropriate correspondences are used for the current motion phase, determined above.
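Putting the last few steps together, a minimal sketch of target localization at delivery time is shown below: the phase-specific distance vectors are added to the tracked surface points and the resulting points are reduced to a centroid estimate. Names and data structures are hypothetical.

```python
import numpy as np

def locate_target(realtime_surface, current_phase, correspondences):
    """Estimate the current target position from real-time surface tracking data.

    realtime_surface : (N, 3) tracked surface points in the treatment-room frame
    current_phase    : motion phase determined from the real-time signal
    correspondences  : dict phase -> (N, 3) distance vectors (target minus surface)
    """
    target_points = realtime_surface + correspondences[current_phase]
    return target_points.mean(axis=0)   # centroid estimate of the target
```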
- the delivery control system 56 can perform delivery gating using a gate module 62.
- the gate module 62 can gate the treatment beams based on whether the current motion phase corresponds to one of the PTVs. Additionally, the location of the one or more PTVs of the treatment plan 36 can be defined in the treatment plan 36. If the determined location of the target falls outside the current PTV, the treatment beams can be gated off until the target returns to the PTV.
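A hedged sketch of such a gating rule is given below; it approximates the current PTV as a sphere, which is a simplification for illustration rather than anything specified by the patent, and all names are hypothetical.

```python
import numpy as np

def beam_enabled(target_position, ptv_center, ptv_radius, phase, gated_phases):
    """Decide whether the treatment beam should remain on.

    The beam is gated off if the current phase is not a planned (gated) phase, or if
    the estimated target position falls outside the current PTV, here approximated
    as a sphere of radius ptv_radius around ptv_center.
    """
    if phase not in gated_phases:
        return False
    distance = np.linalg.norm(np.asarray(target_position) - np.asarray(ptv_center))
    return distance <= ptv_radius
```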
- Dynamic PTV, suitably performed using a dynamic PTV module 64, refers to changing the location of the current PTV based on the determined location of the target. For example, the path of the target is followed by changing the positions of multi-leaf collimator (MLC) leaves on the LINAC.
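As a simple illustration of following the target path with the MLC, the sketch below shifts all leaf positions by the in-plane target displacement and clips them to the mechanical travel range; real leaf sequencing is considerably more involved, and the names here are hypothetical.

```python
import numpy as np

def shift_mlc_leaves(leaf_positions, target_displacement_mm, travel_limits_mm):
    """Shift MLC leaf tip positions to follow a lateral target displacement.

    leaf_positions         : (L, 2) array of (left, right) leaf tip positions in mm
    target_displacement_mm : target displacement along the leaf travel axis in mm
    travel_limits_mm       : (min_mm, max_mm) mechanical travel range of the leaves
    """
    shifted = np.asarray(leaf_positions) + target_displacement_mm
    return np.clip(shifted, travel_limits_mm[0], travel_limits_mm[1])
```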
- the planning system 22 and the delivery control system 56 include one or more program memories 66, 68 and one or more processors 70, 72.
- the program memories 66, 68 store processor executable instructions for carrying out the functions associated with the planning system 22 and the delivery control system 56, including those associated with the user interface module 26, the segmentation module 24, the body surface tracking modules 32, 60, the optimization module 34, the registration module 30, the synchronization module 28, the gate module 62, and the dynamic PTV module 64.
- the processors 70, 72 execute the processor executable instructions stored on the memories 66, 68.
- the planning system 22 and/or the delivery control system 56 further include one or more system buses 74, 76 facilitating communication between the processors 70, 72, the program memories 66, 68, the user input device 42, and the user output device 40.
- a pre-procedural planning 4D CT image 102 is generated using, for example, the pre-procedural imaging system 12. Further, body surface tracking data 104 for the pre-procedural planning 4D CT image is generated using, for example, the pre-procedural body surface tracking system 18. The pre-procedural planning 4D CT image 102 is then used for treatment planning 106, thereby generating a treatment plan. Further, the pre-procedural planning 4D CT image 102 is used in conjunction with the body surface tracking data 104 to generate 108 distance vectors spanning from an external body surface of the patient 16 to the internal surface of the target of the patient 16. The external body surface is determined from the body surface tracking data 104, and the internal target surface is determined from the pre-procedural 4D CT image 102.
- pre-fraction planning images 110 are generated using, for example, the pre-fraction imaging system 48.
- body surface tracking data 104 for the pre-fraction planning images 110 is generated using, for example, the pre-fraction body surface tracking system 54.
- the pre-fraction planning images 110 are then used to update the treatment plan before corresponding treatment fractions (e.g., to account for a smaller target and PTV due to shrinkage from the earlier fractions).
- the updating includes registering and transforming the pre-procedural planning 4D CT image 14 into the pre-fraction 4D US image 50 to adjust the target, OARs and other anatomy for the changes in size, shape and relationship due to the treatment in the preceding fractions.
- the pre-fraction planning images 110 are used in conjunction with the body surface tracking data 104 to update 108 distance vectors spanning from the external body surface of the patient 16 to the internal surface of the target of the patient 16.
- the pre-procedural planning 4D CT 102, the body surface tracking data 104 and the pre-fraction planning images 110 are temporally synchronized, for example, based on motion phase, such as respiratory phase.
- the patient undergoes therapy delivery 112 using, for example, the therapy delivery apparatus 46.
- Therapy is delivered using adaptive paradigms, such as gating and dynamic PTV, based on the treatment plan, the distance vectors, and real-time body surface tracking data 104.
- a pre-procedural planning 4D CT image 152 including a target T within a lung, surrounding normal tissue, and an external chest surface of a patient is generated. Further, tracking data 154 of the external chest surface is generated. The tracking data 154 is then registered 156 to the external chest surface 158 of the pre-procedural planning 4D CT image 152, and used to identify the respiratory pattern 160 of the patient. Further, the pre-procedural planning 4D CT image 152 is segmented 162 to delineate ROIs, such as the target.
- a lung and target motion model 164 is determined, which is represented by a vector q. Further, based on the registered surface tracking data, a surface motion model 166 of the external chest surface is determined, which is represented by a vector S. Based on the motion models 164, 166 and the respiratory phase, a distance vector between the external chest surface and the target is determined 168, for example, by taking the difference of q and S. Further, based on the motion models 164, 166 and the respiratory phase, a treatment plan with a dynamic PTV 170 is determined 168. The PTV is denoted by the generally oval line surrounding and slightly spaced from the target T.
- pre-fraction 4D images 172 are generated.
- the pre-fraction 4D images 172 include the target, the surrounding normal tissue of the target, and typically the external chest surface of a patient. Further, additional tracking data 154 of the external chest surface is generated.
- the pre-fraction 4D images 172 and the additional tracking data 154 are then used to update 174 the lung and target motion model 164, now represented by a vector q', and the surface motion model 166, now represented by a vector S'.
- These updated models are used to adjust 176 the dynamic PTV 170. Note that the target and PTV have shrunk from an almost circular shape in the original or an earlier PTV 170 to a smaller, more kidney-like shape in the adjusted PTV 176.
- the patient undergoes therapy delivery 178.
- Therapy is delivered using adaptive paradigms, such as gating and dynamic PTV, based on the treatment plan, the distance vectors, and real-time tracking data 154.
- a memory includes one or more of a non-transient computer readable medium; a magnetic disk or other magnetic storage medium; an optical disk or other optical storage medium; a random access memory (RAM), read-only memory (ROM), or other electronic memory device or chip or set of operatively interconnected chips; an Internet/Intranet server from which the stored instructions may be retrieved via the Internet/Intranet or a local area network; or so forth.
- a processor includes one or more of a microprocessor, a microcontroller, a graphic processing unit (GPU), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and the like;
- a controller includes at least one memory and at least one processor, the processor executing processor executable instructions on the memory;
- a user input device includes one or more of a mouse, a keyboard, a touch screen display, one or more buttons, one or more switches, one or more toggles, and the like; and
- a display device includes one or more of an LCD display, an LED display, a plasma display, a projection display, a touch screen display, and the like.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Physiology (AREA)
- Radiation-Therapy Devices (AREA)
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016503755A JP6490661B2 (ja) | 2013-03-25 | 2014-03-17 | Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy |
| CN201480018320.8A CN105102062B (zh) | 2013-03-25 | 2014-03-17 | Improved surface tracking-based motion management and dynamic planning method in adaptive external beam radiation therapy |
| US14/772,805 US10376714B2 (en) | 2013-03-25 | 2014-03-17 | Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy |
| EP14715689.7A EP2978496B1 (en) | 2013-03-25 | 2014-03-17 | Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy |
| BR112015024300-2A BR112015024300B1 (pt) | 2013-03-25 | 2014-03-17 | Therapy planning system for treating an internal target of a patient, one or more processors programmed to control a delivery control system for treating an internal target of a patient, and therapy delivery system for treating an internal target of a patient |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201361804827P | 2013-03-25 | 2013-03-25 | |
| US61/804,827 | 2013-03-25 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014155232A1 (en) | 2014-10-02 |
Family
ID=50440720
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2014/059884 Ceased WO2014155232A1 (en) | 2013-03-25 | 2014-03-17 | Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US10376714B2 |
| EP (1) | EP2978496B1 |
| JP (1) | JP6490661B2 |
| CN (1) | CN105102062B |
| BR (1) | BR112015024300B1 |
| WO (1) | WO2014155232A1 |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016187319A1 (en) * | 2015-05-18 | 2016-11-24 | Varian Medical Systems, Inc. | System for in-layer synchronization for fast spot rescanning |
| WO2017005758A1 (en) * | 2015-07-09 | 2017-01-12 | Koninklijke Philips N.V. | Radiation therapy system using plural treatment plans |
| US10342995B2 (en) | 2016-09-30 | 2019-07-09 | Varian Medical Systems Particle Therapy Gmbh. | System and method for scanned ion beam interplay effect mitigation using random repainting |
| JP2020054875A (ja) * | 2015-01-28 | 2020-04-09 | Elekta, Inc. | Three-dimensional localization and tracking for adaptive radiation therapy |
| CN117017496A (zh) * | 2023-09-28 | 2023-11-10 | 真健康(北京)医疗科技有限公司 | Flexible body surface positioning device and navigation and positioning system for puncture surgery |
Families Citing this family (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10342996B2 (en) * | 2016-08-29 | 2019-07-09 | Accuray Incorporated | Online angle selection in rotational imaging and tracking systems |
| CN106310544A (zh) * | 2016-09-30 | 2017-01-11 | 上海联影医疗科技有限公司 | Tumor real-time monitoring method and device, and radiation therapy system |
| US11000706B2 (en) * | 2016-12-13 | 2021-05-11 | Viewray Technologies, Inc. | Radiation therapy systems and methods |
| US10751014B2 (en) * | 2017-01-06 | 2020-08-25 | Accuray Incorporated | Using a rotating 2D X-ray imager as an imaging device to perform target tracking during radiation treatment delivery |
| CN110290832B (zh) * | 2017-01-30 | 2021-08-24 | Koninklijke Philips N.V. | Assessment of the achievability of treatment goals in radiation therapy |
| US11793579B2 (en) | 2017-02-22 | 2023-10-24 | Covidien Lp | Integration of multiple data sources for localization and navigation |
| EP3391940A1 (en) * | 2017-04-21 | 2018-10-24 | Koninklijke Philips N.V. | Planning system for adaptive radiation therapy |
| US11273326B2 (en) * | 2017-06-29 | 2022-03-15 | Canon Medical Systems Corporation | Radiotherapy system and treatment support apparatus |
| US10183179B1 (en) * | 2017-07-21 | 2019-01-22 | Varian Medical Systems, Inc. | Triggered treatment systems and methods |
| EP3449830B1 (de) * | 2017-08-31 | 2020-01-29 | Siemens Healthcare GmbH | Control of a medical imaging system |
| JP7109899B2 (ja) * | 2017-10-17 | 2022-08-01 | Canon Medical Systems Corporation | Radiation therapy system |
| CN108144196A (zh) * | 2018-02-22 | 2018-06-12 | 戴建荣 | Surface imaging device and method for an intraoperative radiotherapy system |
| EP3756728A1 (en) * | 2019-06-24 | 2020-12-30 | Vision RT Limited | Patient motion tracking system configured for automatic roi generation |
| US11040221B2 (en) | 2019-08-13 | 2021-06-22 | Elekta Ltd. | Adaptive radiation therapy using composite imaging slices |
| GB2601560A (en) | 2020-12-04 | 2022-06-08 | Elekta Instr Ab | Methods for adaptive radiotherapy |
| WO2022136925A1 (en) * | 2020-12-23 | 2022-06-30 | Ebamed Sa | A multiplanar motion management system |
| US11660473B2 (en) | 2020-12-30 | 2023-05-30 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11604564B2 (en) | 2020-12-30 | 2023-03-14 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11577095B2 (en) | 2020-12-30 | 2023-02-14 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11654303B2 (en) | 2020-12-30 | 2023-05-23 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11786757B2 (en) | 2020-12-30 | 2023-10-17 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11844962B2 (en) | 2020-12-30 | 2023-12-19 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11759656B2 (en) | 2020-12-30 | 2023-09-19 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11817210B2 (en) | 2020-12-30 | 2023-11-14 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11712587B2 (en) | 2020-12-30 | 2023-08-01 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11607563B2 (en) * | 2020-12-30 | 2023-03-21 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11786756B2 (en) | 2020-12-30 | 2023-10-17 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| US11638840B2 (en) | 2020-12-30 | 2023-05-02 | Varian Medical Systems, Inc. | Radiotherapy methods, systems, and workflow-oriented graphical user interfaces |
| WO2025131279A1 (en) * | 2023-12-21 | 2025-06-26 | Brainlab Ag | Validitation of a radiation treatment plan using surface images |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2292300A1 (de) * | 2009-09-02 | 2011-03-09 | LAP GmbH Laser Applikationen | Device and method for displaying a geometric figure on the surface of a patient's body |
| US20120008735A1 (en) * | 2010-06-08 | 2012-01-12 | Accuray, Inc. | Imaging Methods for Image-Guided Radiation Treatment |
| US20120226152A1 (en) * | 2011-03-03 | 2012-09-06 | Porikli Fatih M | Tumor Tracking System and Method for Radiotherapy |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6076005A (en) | 1998-02-25 | 2000-06-13 | St. Jude Children's Research Hospital | Respiration responsive gating means and apparatus and methods using the same |
| JP4509115B2 (ja) | 2003-09-29 | 2010-07-21 | Koninklijke Philips Electronics N.V. | Method and apparatus for planning radiation therapy |
| WO2005030330A1 (en) * | 2003-09-30 | 2005-04-07 | Koninklijke Philips Electronics, N.V. | Target tracking method and apparatus for radiation treatment planning and delivery |
| US8457717B2 (en) | 2004-04-08 | 2013-06-04 | Stanford University | Method and system of adaptive control for reducing motion artifacts and patient dose in four dimensional computed tomography |
| US7713205B2 (en) * | 2005-06-29 | 2010-05-11 | Accuray Incorporated | Dynamic tracking of soft tissue targets with ultrasound images, without using fiducial markers |
| WO2007007276A2 (en) | 2005-07-14 | 2007-01-18 | Koninklijke Philips Electronics | Method of accounting for tumor motion in radiotherapy treatment |
| US7693257B2 (en) | 2006-06-29 | 2010-04-06 | Accuray Incorporated | Treatment delivery optimization |
| CN100496386C (zh) * | 2006-12-29 | 2009-06-10 | 成都川大奇林科技有限责任公司 | Precise radiation therapy planning system |
| EP2410918A1 (en) | 2009-03-25 | 2012-02-01 | Koninklijke Philips Electronics N.V. | Method and apparatus for breathing adapted imaging |
| US9687200B2 (en) * | 2010-06-08 | 2017-06-27 | Accuray Incorporated | Radiation treatment delivery system with translatable ring gantry |
| CN102526890B (zh) * | 2012-02-29 | 2014-12-10 | 赵瑞 | Simulation positioning method for a radiotherapy field |
- 2014
- 2014-03-17 US US14/772,805 patent/US10376714B2/en active Active
- 2014-03-17 JP JP2016503755A patent/JP6490661B2/ja active Active
- 2014-03-17 WO PCT/IB2014/059884 patent/WO2014155232A1/en not_active Ceased
- 2014-03-17 BR BR112015024300-2A patent/BR112015024300B1/pt active IP Right Grant
- 2014-03-17 CN CN201480018320.8A patent/CN105102062B/zh active Active
- 2014-03-17 EP EP14715689.7A patent/EP2978496B1/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2292300A1 (de) * | 2009-09-02 | 2011-03-09 | LAP GmbH Laser Applikationen | Device and method for displaying a geometric figure on the surface of a patient's body |
| US20120008735A1 (en) * | 2010-06-08 | 2012-01-12 | Accuray, Inc. | Imaging Methods for Image-Guided Radiation Treatment |
| US20120226152A1 (en) * | 2011-03-03 | 2012-09-06 | Porikli Fatih M | Tumor Tracking System and Method for Radiotherapy |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020054875A (ja) * | 2015-01-28 | 2020-04-09 | Elekta, Inc. | Three-dimensional localization and tracking for adaptive radiation therapy |
| WO2016187319A1 (en) * | 2015-05-18 | 2016-11-24 | Varian Medical Systems, Inc. | System for in-layer synchronization for fast spot rescanning |
| US9789342B2 (en) | 2015-05-18 | 2017-10-17 | Varian Medical Systems, Inc. | System and method for in-layer synchronization for fast spot rescanning |
| US10857392B2 (en) | 2015-05-18 | 2020-12-08 | Varian Medical Systems, Inc. | System and method for in-layer synchronization for fast spot rescanning |
| WO2017005758A1 (en) * | 2015-07-09 | 2017-01-12 | Koninklijke Philips N.V. | Radiation therapy system using plural treatment plans |
| US10342995B2 (en) | 2016-09-30 | 2019-07-09 | Varian Medical Systems Particle Therapy Gmbh. | System and method for scanned ion beam interplay effect mitigation using random repainting |
| CN117017496A (zh) * | 2023-09-28 | 2023-11-10 | 真健康(北京)医疗科技有限公司 | Flexible body surface positioning device and navigation and positioning system for puncture surgery |
| CN117017496B (zh) * | 2023-09-28 | 2023-12-26 | 真健康(北京)医疗科技有限公司 | Flexible body surface positioning device and navigation and positioning system for puncture surgery |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6490661B2 (ja) | 2019-03-27 |
| US10376714B2 (en) | 2019-08-13 |
| CN105102062B (zh) | 2018-11-13 |
| BR112015024300A2 (pt) | 2017-07-18 |
| BR112015024300B1 (pt) | 2023-02-28 |
| EP2978496A1 (en) | 2016-02-03 |
| JP2016512781A (ja) | 2016-05-09 |
| US20160016007A1 (en) | 2016-01-21 |
| CN105102062A (zh) | 2015-11-25 |
| EP2978496B1 (en) | 2018-12-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10376714B2 (en) | Method for improved surface tracking-based motion management and dynamic planning in adaptive external beam radiation therapy | |
| AU2024202024B2 (en) | Automatic gating with an mr linac | |
| US11413475B2 (en) | Elasticity imaging-based methods for improved gating efficiency and dynamic margin adjustment in radiation therapy | |
| EP3123443B1 (en) | Method and device for generating one or more computer tomography images based on magnetic resonance images with the help of tissue class separation | |
| Zhang et al. | A patient‐specific respiratory model of anatomical motion for radiation treatment planning | |
| CN107072628B (zh) | 用于放射治疗的图像导引 | |
| JP6656251B2 (ja) | Mri誘導リナックのモーション管理 | |
| US9757588B2 (en) | Deformable registration of images for image guided radiation therapy | |
| US11443441B2 (en) | Deep inspiration breath-hold setup using x-ray imaging | |
| CN104093450A (zh) | 用于自适应处置规划的束节段水平剂量计算与时间运动跟踪 | |
| Alam et al. | Medical image registration: Classification, applications and issues |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201480018320.8; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14715689; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2014715689; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 14772805; Country of ref document: US |
| | ENP | Entry into the national phase | Ref document number: 2016503755; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112015024300; Country of ref document: BR |
| | ENP | Entry into the national phase | Ref document number: 112015024300; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20150922 |