WO2012012768A1 - System and method for identifying an anatomical organ in a patient
- Publication number
- WO2012012768A1 (PCT/US2011/045092)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- set forth
- patient
- probability
- organ
- Prior art date
- 2010-07-23
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/77—Determining position or orientation of objects or cameras using statistical methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/1048—Monitoring, verifying, controlling systems and methods
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
- A61N2005/1061—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N5/00—Radiation therapy
- A61N5/10—X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
- A61N5/103—Treatment planning systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- FIG. 1 is a schematic diagram of a radiation therapy treatment system.
- FIG. 2 is a perspective view of the radiation therapy treatment system.
- FIG. 3 is a perspective view of a multi-leaf collimator that can be used in the radiation therapy treatment system illustrated in FIG. 2.
- FIG. 4 is a schematic illustration of the radiation therapy treatment system of FIG. 2.
- FIG. 5 is a schematic diagram of a software program used in the radiation therapy treatment system.
- FIG. 6 is a flow chart of a method of identifying an anatomical organ in a patient according to an embodiment of the invention.
- FIG. 7 illustrates an image segmented into couch, body, and background (left), air, fat, muscle, bone, and skin tissues (center), and different organs (right).
- FIG. 8 is a schematic illustration of the hierarchical steps of a segmentation process embodying the invention.
- FIG. 9 shows three slices of a probability map computed from probabilistic anatomical atlases representing training data.
- FIG. 10 illustrates an image with manual contours overlaid (left), the atlas contours overlaid prior to deformation (center), and after deforming by a Marionette method (right).
- embodiments of the invention may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware.
- the electronic-based aspects of the invention may be implemented in software (e.g., stored on a non-transitory computer-readable medium).
- a plurality of hardware and software based devices, as well as a plurality of different structural components may be utilized to implement the invention.
- FIG. 1 illustrates a radiation therapy treatment system 10 that provides radiation therapy to a patient 14 according to one embodiment of the present invention.
- the radiation therapy treatment system 10 includes a treatment delivery system 11 and a treatment planning system 12.
- the radiation therapy treatment can include photon-based radiation therapy, brachytherapy, electron beam therapy, proton, neutron, particle therapy, or other types of treatment therapy.
- the treatment delivery system 11 includes a gantry 18.
- the gantry 18 supports a radiation module 22, which includes a radiation source 24 and a LINAC 26 that generates a beam 30 of radiation.
- while the gantry 18 shown in FIG. 2 is a ring gantry (i.e., it extends through a full 360° arc to create a complete ring or circle), other types of mounting arrangements may also be employed.
- a C-type, partial ring gantry, or robotic arm gantry arrangement could be used.
- Any other framework capable of positioning the radiation module 22 at various rotational and/or axial positions relative to the patient 14 may also be employed.
- the radiation source 24 may travel in a path that does not follow the shape of the gantry 18.
- the radiation source 24 may travel in a non-circular path even though the illustrated gantry 18 is generally circular-shaped.
- the gantry 18 of the illustrated embodiment defines a gantry aperture 32 into which the patient 14 moves during treatment.
- the radiation module 22 also includes a modulation device 34 operable to modify or modulate the radiation beam 30.
- the modulation device 34 modulates the radiation beam 30 and directs the radiation beam 30 toward the patient 14.
- the radiation beam 30 is directed toward a portion 38 of the patient 14.
- the portion 38 may include the patient's entire body, but is generally smaller than the patient's entire body and can be defined by a two-dimensional area and/or a three-dimensional volume.
- a portion may include one or more regions of interest.
- a portion desired to receive the radiation, which may be referred to as a target 38 or target region, is an example of a region of interest.
- Another type of region of interest is a region at risk.
- the modulation device 34 can include a collimation device 42 as illustrated in FIG. 2.
- the collimation device 42 includes a set of jaws 46 that define and adjust the size of an aperture 50 through which the radiation beam 30 may pass.
- the jaws 46 include an upper jaw 54 and a lower jaw 58. The upper jaw 54 and the lower jaw 58 are moveable to adjust the size of the aperture 50.
- the modulation device 34 can comprise a multi-leaf collimator 62, which includes a plurality of interlaced leaves 66 operable to move from a first position to a second position to provide intensity modulation of the radiation beam 30. It is also noted that the leaves 66 can move to a position anywhere between a minimally open and a maximally open position.
- the plurality of interlaced leaves 66 modulate the strength, size, and shape of the radiation beam 30 before the radiation beam 30 reaches the target 38 on the patient 14.
- Each of the leaves 66 is independently controlled by an actuator 70, such as a motor or an air valve, so that the leaf 66 can open and close quickly to permit or block the passage of radiation.
- the actuators 70 can be controlled by a computer 74 and/or controller.
- the treatment delivery system 11 can also include a detector 78, e.g., a kilovoltage or a megavoltage detector, that receives the radiation beam 30.
- the linear accelerator 26 and the detector 78 can also operate as a computed tomography (CT) system to generate CT images of the patient 14.
- the linear accelerator 26 emits the radiation beam 30 toward the target 38 in the patient 14.
- the target 38 absorbs some of the radiation.
- the detector 78 detects or measures the amount of radiation absorbed by the target 38.
- the detector 78 collects the absorption data from different angles as the linear accelerator 26 rotates around and emits radiation toward the patient 14.
- the collected absorption data is transmitted to the computer 74 for processing of the absorption data and generating images of the patient's body tissues and organs.
- the images can also illustrate bone, soft tissues, and blood vessels.
- the CT images can be acquired with a radiation beam 30 that has a fan-shaped geometry, a multi-slice geometry, or a cone-beam geometry.
- the CT images can be acquired with the linear accelerator 26 delivering megavoltage energies or kilovoltage energies.
- the acquired CT images can be registered with previously acquired CT images (from the radiation therapy treatment system 10 or other image acquisition devices, such as other CT scanners, MRI systems, and PET systems).
- the previously acquired CT images for the patient 14 can include identified targets 38 made through a contouring process.
- the newly acquired CT images for the patient 14 can be registered with the previously acquired CT images to assist in identifying the targets 38 in the new CT images.
- the registration process can use rigid or deformable registration tools.
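As an illustration of how such a registration step might look in code, here is a minimal sketch using SimpleITK, the Python wrapping of the Insight Toolkit that the detailed description later names for deformation; the file names and parameter values are illustrative assumptions, not part of the patent.

```python
import SimpleITK as sitk

# Hypothetical file names: a newly acquired daily CT (fixed) and a
# previously acquired planning CT (moving) to be aligned to it.
fixed = sitk.ReadImage("daily_ct.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("planning_ct.nii.gz", sitk.sitkFloat32)

# Rigid registration: estimate a 3D rotation + translation.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=2.0, minStep=1e-4, numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))
reg.SetInterpolator(sitk.sitkLinear)
rigid_transform = reg.Execute(fixed, moving)

# Resample the planning image (and, by the same transform, its contours)
# into the frame of the daily image; -1000 HU (air) fills the border.
aligned = sitk.Resample(moving, fixed, rigid_transform,
                        sitk.sitkLinear, -1000.0)
```

A deformable registration would refine this rigid result; a sketch of a demons-style free-form stage appears later, alongside the deformation discussion.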
- the image data can be presented on a display as either a three-dimensional image or a series of two-dimensional images.
- the image data comprising the images can be either voxels (for three-dimensional images) or pixels (for two-dimensional images).
- the term "image element" is used generally in the description to refer to both.
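A minimal sketch of the "image element" abstraction, assuming the common representation of a volume as a NumPy array with explicit voxel spacing; the shape and spacing values are hypothetical.

```python
import numpy as np

# A three-dimensional image: a grid of voxels plus the physical size of
# each voxel. Shape and spacing below are illustrative, not from the patent.
volume = np.zeros((120, 512, 512), dtype=np.int16)  # (slices, rows, columns)
spacing_mm = (3.0, 0.98, 0.98)                      # voxel size along (z, y, x)

# A pixel is an image element of one two-dimensional slice; a voxel is an
# image element of the three-dimensional volume.
axial_slice = volume[60]              # one 512 x 512 two-dimensional image
pixel_value = axial_slice[256, 256]
voxel_value = volume[60, 256, 256]
```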
- the treatment delivery system 11 also includes a patient support device, such as a couch 82 (illustrated in FIG. 1), which supports the patient 14.
- the couch 82, or at least portions thereof, moves into and out of the field of radiation along an axis 84.
- the couch 82 is also capable of moving along the X and Z axes as illustrated in FIG. 1.
- the patient support can be a device that is adapted to support any portion of the patient's body.
- the patient support is not limited to having to support the entire patient's body.
- the system 11 also can include a drive system 86 operable to manipulate the position of the couch 82.
- the drive system 86 can be controlled by the computer 74.
- the treatment planning system 12 includes the computer 74, which is embodied as an operator station to be accessed by medical personnel.
- the computer 74 includes a controller 75, a user interface module 76, a display 77, and a communications module 79.
- the controller 75 and the user interface module 76 include combinations of software and hardware that are operable to, among other things, control the operation of the treatment delivery system 11 and the information that is presented on the display 77.
- the controller 75 includes, for example, a processing unit 80 (e.g., a microprocessor, a microcontroller, or another suitable programmable device), a memory 81, and a bus 83.
- the bus 83 connects various components of the controller 75, including the memory 81, to the processing unit 80.
- the processing unit 80 may represent one or more general-purpose processors, a special purpose processor such as a digital signal processor or other type of device such as a controller or field programmable gate array.
- although the controller 75, the user interface module 76, the display 77, and the communications module 79 are illustrated as part of a single server or computing device, the components of the treatment planning system 12 can be distributed over multiple servers or computing devices.
- the treatment planning system 12 can include multiple controllers 75, user interface modules 76, displays 77, and communications modules 79.
- the memory 81 includes, for example, a read-only memory ("ROM"), a random access memory ("RAM"), and an electrically erasable programmable read-only memory ("EEPROM").
- the processing unit 80 is connected to the memory 81 and executes software program 90 that is capable of being stored in the RAM (e.g., during execution), the ROM (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a disc. Additionally or alternatively, the memory 81 is included in the processing unit 80.
- the controller 75 also includes an input/output ("I/O") system 85 that includes routines for transferring information between components within the controller 75 and other components of the treatment planning system 12.
- Software included in the implementation of the treatment planning system 12 is stored in the memory 81 of the controller 75.
- the software includes, for example, firmware, one or more applications, program data, one or more program modules, and other executable instructions.
- the controller 75 is configured to retrieve from memory and execute, among other things, instructions related to the methods described below.
- the user interface module 76 is configured for user control of the treatment planning system 12.
- the user interface module 76 is operably coupled to the controller 75 to control the information presented on the display 77.
- the user interface module 76 can include a combination of digital and analog input or output devices required to achieve a desired level of control for the treatment planning system 12.
- the user interface module 76 can include input devices such as a touch-screen display, a plurality of knobs, a plurality of dials, a plurality of switches, a plurality of buttons, or the like.
- the display 77 is, for example, a liquid crystal display (“LCD”), a light-emitting diode (“LED”) display, an organic LED (“OLED”) display, an electroluminescent display (“ELD”), a surface-conduction electron-emitter display (“SED”), a field emission display (“FED”), a thin-film transistor (“TFT”) LCD, or the like.
- the display 77 is a Super AMOLED ("active-matrix OLED") display.
- the treatment planning system 12 is also configured to connect to a network (e.g., a wide area network ("WAN"), a local area network ("LAN"), or the like) via the communications module 79 to access other programs, software, other treatment planning systems 12, or treatment delivery systems 11.
- the communications module 79 can include a network interface, such as an Ethernet card or a wireless network card, that allows the treatment planning system 12 to send and receive information over a network, such as a local area network or the Internet.
- the communications module 79 includes drivers configured to receive and send data to and from various input and/or output devices, such as a keyboard, a mouse, a printer, etc.
- the communications module 79 is also compatible with the Digital Imaging and Communications in Medicine (DICOM) protocol with any version and/or other required protocol.
- DICOM is an international communications standard developed by NEMA that defines the format used to transfer medical image-related data between different pieces of medical equipment.
- DICOM RT refers to the standards that are specific to radiation therapy data.
- the connections in FIGS. 1 and 4 generally represent two-way communication and information transfer where indicated. However, for some medical and computerized equipment, only one-way communication and information transfer may be necessary.
- the processing unit 80 executes instructions stored in the computer-readable media.
- the instructions can include various components or modules configured to perform particular functionality when executed by the processing unit 80.
- the computer-readable media includes a treatment planning process application that interacts with the user interface module 76 to display on the display 77 various "screens" or "pages" related to the patient's treatment plan.
- the screens of the user interface are not limited to the arrangement shown in any of the drawings.
- the screens may include, but are not limited to, fields, columns, rows, dialog boxes, tabs, buttons, radio buttons, and drop-down menus. Field titles may vary and are not limited to those shown in the drawings.
- the treatment planning system 12 can represent a server that hosts the treatment planning process application as a network-based tool or application. Therefore, a user can access the treatment planning process application through a network, such as the Internet. Accordingly, in some embodiments, a user is not required to have the treatment planning process application permanently installed on the computer 74. Rather, the user can access the treatment planning process application using a browser application, such as Internet Explorer®.
- the software program 90 includes a plurality of modules that interact or communicate with one another to provide instructions to the processing unit for generating a treatment plan for a patient, modifying or adapting a treatment plan, acquiring images of the patient, and controlling the components of the treatment delivery system 11.
- the software program 90 includes an image module 102 operable to acquire or receive images of at least a portion of the patient 14.
- the image module 102 can generate instructions for the on-board or on-line image device, such as a CT imaging device to acquire images of the patient 14 before treatment commences, during treatment, and after treatment according to desired protocols.
- the on-board or on-line image device can comprise the radiation source and the detector, where the radiation source delivers kV or MV radiation to the patient that is collected by the detector and processed into a three-dimensional (e.g., CT) image.
- for CT images, the data comprising the patient images are composed of image elements stored as data in the radiation therapy treatment system. These image elements may be any data construct used to represent image data, including two-dimensional pixels or three-dimensional voxels.
- the images can be stored in memory or in a database and retrieved by the image module 102 for later use.
- the image module 102 acquires an image of the patient 14 while the patient 14 is substantially in a treatment position.
- Other off-line imaging devices or systems may be used to acquire pre-treatment images (e.g., three-dimensional) of the patient 14, such as non-quantitative CT, MRI, PET, SPECT, ultrasound, transmission imaging, fluoroscopy, RF-based localization, and the like.
- the acquired images can be used for registration/alignment of the patient 14 with respect to the gantry or other point and/or to determine or predict a radiation dose to be delivered to the patient 14.
- the acquired images also can be used to generate a deformation map that identifies the differences between one or more of the planning images and one or more of the pre-treatment (e.g., daily), during-treatment, or after-treatment images.
- the acquired images also can be used to determine a radiation dose that the patient 14 received during the prior treatments.
- the image module 102 also is operable to acquire images of at least a portion of the patient 14 while the patient is receiving treatment to determine a radiation dose that the patient 14 is receiving in real-time.
- the software program 90 also includes a treatment plan module 106 including instructions for generating a treatment plan for the patient 14 based on data input to the system 10 by medical personnel, retrieved from memory or a database, received from other modules, or otherwise acquired by the system 10.
- the data can include one or more images (e.g., planning images and/or pre-treatment images) of at least a portion of the patient 14. These images may be acquired and processed by the image module 102 in the manner described in the preceding paragraphs.
- the treatment plan module 106 can separate the treatment plan into a plurality of treatment fractions and can determine the radiation dose for each fraction or treatment based on a radiation dose prescription input by medical personnel. For example, a prescription of 70 Gy delivered in 35 fractions corresponds to 2 Gy per fraction.
- the treatment plan module 106 can communicate with a contour module 115, which includes instructions for generating various contours to be drawn around the target 38. Medical personnel can utilize the contour module 115 via the user interface module 76 to contour or identify particular areas on and/or around the target 38 and to specify the amount of radiation dose for each of the areas.
- medical personnel can utilize one or more of the images to generate one or more contours on the one or more images to identify one or more treatment regions or avoidance regions of the target 38.
- the contour process can include using geometric shapes, including three-dimensional shapes to define the boundaries of the treatment region of the target 38 that will receive radiation and/or the avoidance region of the target 38 that will receive minimal or no radiation.
- the medical personnel can use a plurality of predefined geometric shapes to define the treatment region(s) and/or the avoidance region(s). The plurality of shapes can be used in a piecewise fashion to define irregular boundaries.
- after the treatment plan is established (though it is not static and can change throughout the course of treatment), the patient 14 returns to the medical facility to receive the radiation dose prescribed for each fraction. Prior to delivery of each fraction, the patient is positioned on the couch 82 and registered or aligned with respect to the treatment delivery system 11.
- the patient positioning module 110 provides instructions to the drive system 86 to move the couch 82, or the patient 14 can be manually moved to the new position.
- the patient positioning module 110 can receive data from lasers positioned in the treatment room to provide patient position data with respect to the isocenter of the gantry 18. Based on the data from the lasers, the patient positioning module 110 provides instructions to the drive system 86, which moves the couch 82 to achieve proper alignment of the patient 14 with respect to the gantry 18. It is noted that devices and systems, other than lasers, can be used to provide data to the patient positioning module 110 to assist in the alignment process.
- a daily pre-treatment image (e.g., a 3D or volumetric image, sometimes referred to as a fraction image or daily image) is acquired while the patient remains in substantially a treatment position.
- the daily image can be compared to previously acquired images of the patient to identify any changes in the target 38 or other anatomical structures over the course of treatment (e.g., from previously-delivered fractions).
- the changes in the target 38 or other structures are sometimes referred to as deformation.
- Deformation may require that the original treatment plan be modified to account for the deformation.
- the contour module 115 can automatically apply and conform the preexisting contours to take into account the deformation. To do this, a deformation algorithm (discussed below) identifies the changes to the target 38 or other structures. These identified changes are input to the contour module 115, which then modifies the contours based on those changes.
- the software program 90 can also include a deformation module 118 including instructions to deform an image(s) while improving the anatomical significance of the results.
- the deformation of the image(s) can be used to generate a deformation map to identify the differences between one or more of the planning images and one or more of the daily images.
- the deformed image(s) also can be used for registration of the patient 14 and/or to determine or predict a radiation dose to be delivered to the patient 14.
- the deformed image(s) also can be used to determine a radiation dose that the patient 14 received during the prior treatments or fractions.
- the image module 102 is also operable to acquire one or more images of at least a portion of the patient 14 while the patient is receiving radiation treatment that can be deformed to determine a radiation dose that the patient 14 is receiving in real-time.
- the software program 90 also includes a segmentation module 126 for effecting segmentation of the images acquired by the image module 102. Segmentation (discussed below in more detail) is the process of assigning a label to each voxel or at least some of the voxels in one of the images. The label represents the type of tissue present within the voxel. The segmentation is stored as an image (array of voxels). In one embodiment, the segmentation module 126 is operable to segment an image to identify a region of interest defined by at least one anatomic landmark. The segmentation module 126 may be a stand-alone software module or may be integrated with any of the software modules.
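As a sketch of what "the segmentation is stored as an image (array of voxels)" can mean in practice, here is a minimal label-map representation; the integer codes and the placeholder criterion are assumptions of this sketch, not an encoding specified by the patent.

```python
import numpy as np

# Illustrative integer codes for a label image.
AIR, FAT, MUSCLE, BONE, SKIN = 0, 1, 2, 3, 4

ct = np.zeros((120, 512, 512), dtype=np.int16)  # the acquired image
labels = np.zeros(ct.shape, dtype=np.uint8)     # the segmentation, itself stored as an image

# A hypothetical upstream step produces a boolean mask for some tissue;
# writing it into `labels` assigns that tissue type to those voxels.
muscle_mask = ct > 20                           # placeholder criterion
labels[muscle_mask] = MUSCLE
```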
- the software program 90 can also include a probability map generation module 130, which includes instructions for generating a probability map based on one or more probabilistic anatomical atlases.
- the probability map generation module 130 uses segmented images of training data to generate probabilistic anatomical atlases. For example, these images of training data are obtained from patients representing a class of the population with tumors located in areas near healthy organs (e.g., the head and neck area, prostate area, etc.). In some embodiments, these healthy organs are essential to the human body (e.g., parotid glands, heart, rectum, etc.).
- the software program 90 can also include an organ identification module 135.
- the organ identification module 135 includes instructions for determining a probability that an organ (e.g., parotid glands or another healthy organ to be avoided during radiation) is located in a region of interest identified by the segmentation module 126. For example, as described in greater detail in the following paragraphs, the organ identification module 135 applies the probability map created by the probability map generation module 130 to a probabilistic classification to determine the probability that the organ in question is located within the determined region of interest.
- FIG. 6 is a flow chart of a method 150 of identifying an anatomical organ in a patient according to an embodiment of the invention.
- the first step in the method is acquiring an image of the patient (at 151). The image (e.g., a three-dimensional image) includes a plurality of image elements (e.g., voxels).
- the next step includes segmenting the image to identify a region of interest defined by at least one anatomic landmark. This is performed by the segmentation module 126 (at 152).
- the next step involves generating a probability map based on at least one probabilistic anatomical atlas and is performed by the probability map generation module 130 (at 153).
- the method continues by applying the probability map to a probabilistic classification (e.g., situated Bayesian classification) at 154.
- the next step is determining a probability that the organ is located in the identified region of interest. Generally, this is performed by the organ identification module 135, which in some embodiments further delineates the location of the organ within the region of interest. The following paragraphs describe each of the steps in the method 150 in more detail.
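Read as a whole, method 150 is a short pipeline. The sketch below mirrors steps 151-154; the callables are hypothetical stand-ins for the modules described above, not APIs from the patent.

```python
def identify_organ(scan, atlases, segment, build_map, classify):
    """Sketch of method 150 (FIG. 6).

    `segment`, `build_map`, and `classify` stand in for the segmentation
    module (step 152), the probability map generation module (step 153),
    and the probabilistic classification (step 154).
    """
    roi = segment(scan)                  # step 152: ROI from anatomic landmarks
    prob_map = build_map(scan, atlases)  # step 153: atlas-based probability map
    prob_map = prob_map * roi            # restrict the map to the region of interest
    return classify(scan, prob_map)      # step 154: probability organ is in the ROI
```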
- the segmentation module 126 is configured to segment the images acquired by the image module 102.
- as described above, the patient images are composed of image elements (two-dimensional pixels or three-dimensional voxels) stored as data in the radiation therapy treatment system.
- the voxels are subjected to the segmentation process.
- segmentation categorizes each image element as one of four different substances in the human body. These four substances or tissue types are air, fat, muscle, and bone.
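A minimal sketch of this element-level categorization, using illustrative Hounsfield-unit cut points; the patent does not give thresholds, but typical CT values are roughly -1000 HU for air, about -100 HU for fat, about +40 HU for muscle, and several hundred HU for bone.

```python
import numpy as np

AIR, FAT, MUSCLE, BONE = 0, 1, 2, 3

def classify_voxels(hu):
    """Assign each voxel one of the four substances by CT intensity.

    The cut points are illustrative assumptions; the hierarchy described
    below refines this element-level guess in later layers.
    """
    labels = np.full(hu.shape, MUSCLE, dtype=np.uint8)
    labels[hu < -200] = AIR                       # well below fat
    labels[(hu >= -200) & (hu < -30)] = FAT
    labels[hu > 150] = BONE                       # well above muscle
    return labels

# Toy 2x2 "image": air, fat, muscle, bone.
print(classify_voxels(np.array([[-1000, -80], [40, 700]])))  # [[0 1] [2 3]]
```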
- FIG. 7 illustrates a segmentation image through the different steps in the segmentation process.
- the segmentation module 126 can apply a five-layer hierarchy (FIG. 8) of segmentation steps that first analyzes each image element individually (the image element or voxel layer 128), then analyzes neighborhoods or groups of image elements collectively (the neighborhood layer 132), organizes them into tissue groups (the tissue layer 136), then organs (the organ layer 140), and finally organ systems (the systems layer 144). During that process, the segmentation module 126 computes or defines a region of interest based on the provided image. As shown in FIG. 7, the image can be initially segmented into couch, body, and background (left image on FIG. 7). Then, the image is segmented into five groups of tissue - air, fat, muscle, bone, and skin (center image).
- the last image of FIG. 7 shows segmentation into different organs (parotid glands, cranium, brainstem, C1, dens, mandible, pharynx, and teeth) with a region of interest for the parotid glands.
- the five-layer hierarchy of steps combines rule-based, atlas-based, and mesh-based approaches to segmentation in order to achieve both recognition and delineation of anatomical structures, thereby defining the complete image as well as the details within the image.
- the region of interest is computed relative to anatomic landmarks.
- the parotid glands are expected to reside in the space bordered superiorly by the middle ear and zygomatic arch, and inferiorly by the bottom reaches of the mandible body.
- the medial boundaries are carved by the styloid process of the temporal bone, while the lateral boundary is the skin.
- the anterior limits are established by the fact that the parotid gland lies along the masseter and wraps around the posterior tip of the mandible ramus.
- the posterior boundary is buttressed by the sternocleidomastoid and the mastoid process.
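The landmark-bounded region just described can be sketched as a box mask built from detected landmark positions. The landmark names and indices below are hypothetical outputs of an upstream detector, and a real implementation would carve the box further rather than keep it axis-aligned.

```python
import numpy as np

def parotid_roi(shape, lm):
    """Boolean region-of-interest mask bounded by landmark positions.

    `lm` maps landmark names to voxel indices; the names mirror the
    boundaries described in the text but are assumptions of this sketch.
    """
    roi = np.zeros(shape, dtype=bool)
    roi[lm["middle_ear"]:lm["mandible_body_bottom"],      # superior..inferior
        lm["masseter_anterior"]:lm["mastoid_posterior"],  # anterior..posterior
        lm["styloid_process"]:lm["skin_lateral"]] = True  # medial..lateral
    return roi

mask = parotid_roi(
    (120, 512, 512),
    {"middle_ear": 20, "mandible_body_bottom": 60,
     "masseter_anterior": 200, "mastoid_posterior": 320,
     "styloid_process": 250, "skin_lateral": 420})
```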
- the probability map generation module 130 uses the region of interest to generate an atlas-based probability map based on at least one probabilistic anatomical atlas.
- a probabilistic atlas can be computed from a training set of CT scans (e.g., 10 scans) obtained from patients representing a class of the population with tumors located in areas near vital healthy organs (e.g., the head and neck area, prostate, etc.).
- the system 10 deforms each of the probabilistic anatomical atlases to the incoming scan (i.e., to the image obtained by the image module 102) to create the atlas-based probability map (see FIG. 9).
- these ten deformed atlases can be combined into a single probability map by using the STAPLE (Simultaneous Truth and Performance Level Estimation) method.
- alternative methods for combining the deformed atlases can also be used (a simple averaging alternative is sketched below). Consequently, the probability map is based on the acquired patient image and is different for every person. As will be described in more detail below, the map is then restricted by the region of interest and applied to a probabilistic classification to determine a probability that an organ (e.g., the parotid glands) is located in the identified region of interest.
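A minimal sketch of that simple averaging alternative, assuming the deformed binary atlas masks have already been resampled onto the patient image grid; plain averaging is a stand-in for STAPLE, which additionally estimates a reliability weight for each atlas.

```python
import numpy as np

def probability_map(deformed_masks):
    """Voxel-wise average of deformed binary atlas masks.

    `deformed_masks` is a list of boolean arrays, one per training case,
    already warped onto the incoming scan's voxel grid. The result is a
    map in [0, 1]: the fraction of atlases voting "organ" at each voxel.
    """
    stack = np.stack([m.astype(np.float32) for m in deformed_masks])
    return stack.mean(axis=0)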
- this atlas-based probability map is referred to as a broad prior.
- the immediate prior is computed by the system during the segmentation of each scan.
- the immediate prior is formed by smoothing and normalizing the binary mask segmentation of the parotid gland on the immediately neighboring slice.
- the first slice segmented is the slice on which C1 (the first cervical vertebra) has the largest span, which is the slice depicted in FIG. 7. It is known that the parotid glands can be expected to have a large, if not the largest, cross-sectional area on this particular slice. From that point, segmentation propagates in both directions, inferior and superior, always using the previously visited slice to compute the immediate prior.
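A minimal sketch of the immediate-prior computation, assuming a Gaussian blur for the smoothing step and max-normalization; both the kernel choice and its width are assumptions, since the patent does not specify them.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def immediate_prior(previous_slice_mask, sigma_vox=2.0):
    """Smooth and normalize the neighboring slice's binary parotid mask.

    Returns a per-pixel prior in [0, 1] for the current slice;
    `sigma_vox` is an assumed smoothing width.
    """
    smoothed = gaussian_filter(previous_slice_mask.astype(np.float32), sigma_vox)
    peak = smoothed.max()
    return smoothed / peak if peak > 0 else smoothed
```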
- the deformation module 118 deforms the probabilistic atlases before applying the probability map to probabilistic classification. It is to be understood that the deformation processes described below represent only exemplary processes of deforming an image, and thus other types of deformations are possible.
- deformation includes the Insight Toolkit's (ITK) implementation of affine registration, followed by ITK's demons algorithm for free-form deformation.
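A sketch of the demons stage via SimpleITK's wrapper of the same ITK component, applied after the affine alignment (see the registration sketch earlier); the histogram-matching step and all parameter values are illustrative choices, not prescribed by the patent.

```python
import SimpleITK as sitk

def demons_warp(fixed, moving_affine_aligned, iterations=50):
    """Free-form (demons) refinement after affine alignment.

    Returns a displacement (warp) field as a vector image.
    """
    # Match intensity histograms so the demons metric compares like with like.
    matcher = sitk.HistogramMatchingImageFilter()
    matcher.SetNumberOfHistogramLevels(1024)
    matcher.SetNumberOfMatchPoints(7)
    matcher.ThresholdAtMeanIntensityOn()
    moving_matched = matcher.Execute(moving_affine_aligned, fixed)

    demons = sitk.DemonsRegistrationFilter()
    demons.SetNumberOfIterations(iterations)
    demons.SetStandardDeviations(1.0)  # Gaussian smoothing of the field
    return demons.Execute(fixed, moving_matched)
```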
- the output of the deformation process is a 3D warp field for each case. If a warp field is applied to a binary mask, such as the manually delineated parotid glands included in the training set, then the warped mask would suffer from fragmentation and holes.
- the warp field is applied not to the masks, but instead to 3D meshes generated from each mask by using the method of marching cubes. After the mesh vertices are displaced, the masks are regenerated using ITK's rasterization filter. The resultant masks are accumulated and normalized to form the probability map shown in FIG. 9. Since there are only 10 discrete levels owing to the size of the training set, the map is smoothed by convolution with a 3D Gaussian kernel. In another embodiment, in order to apply the atlas to the test data (i.e., the patient image), the average of the 10 deformed training scans is deformed, again using ITK, to each of the test scans. The resultant warp field is also applied to the probability map.
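The mask-to-mesh-to-warp idea can be sketched with scikit-image's marching cubes. The warp-field layout here is an assumption, and the rasterization back to a voxel mask (ITK's rasterization filter in the text) is indicated only as a comment.

```python
import numpy as np
from skimage.measure import marching_cubes

def warp_mask_via_mesh(mask, warp_field):
    """Warp a binary organ mask by moving mesh vertices, not voxels.

    `warp_field` is assumed to have shape (3,) + mask.shape: one
    displacement vector per voxel. Moving the mesh avoids the
    fragmentation and holes that warping the binary mask directly causes.
    """
    # Extract a surface mesh from the binary mask.
    verts, faces, _, _ = marching_cubes(mask.astype(np.float32), level=0.5)

    # Displace each vertex by the warp field sampled at its nearest voxel.
    idx = np.clip(np.round(verts).astype(int), 0,
                  np.array(mask.shape) - 1).T
    verts_warped = verts + warp_field[:, idx[0], idx[1], idx[2]].T

    # Rasterizing (verts_warped, faces) back to a voxel mask would follow
    # here; the text uses ITK's rasterization filter for that step.
    return verts_warped, faces
```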
- the system 10 uses a Marionette method of image deformation to deform the probabilistic atlases to the incoming scan.
- FIG. 10 illustrates an image with manual contours overlaid (left), the atlas contours overlaid prior to deformation (center), and after deformation by the Marionette method (right).
- Deformation is used to form the atlas-based probability map by applying the probabilistic atlases to the current scan (i.e., image) of the patient.
- unlike a free-form deformation, in which the shape of an object (i.e., an organ) may change arbitrarily, the Marionette method permits only a few anatomical motions.
- the reference image (i.e., the atlas) and the patient image are both segmented automatically. These segmentations are then analyzed to divine the values of the few parameters that govern the allowable motions.
- a deformation field is generated directly without iteration. This field is then passed into a pure free-form deformation process in order to account for any motion not captured by the model.
- this approach manipulates parameters that tilt, swivel, and nod the cranium, swing the mandible, and shrink/expand fatty tissue to account for weight loss. Further, this method allows the cranium and mandible to scale in size.
- the Marionette method manipulates a shape model associated with some of the organs in the human body. Instead of labeling the voxels in the atlas, the method positions the control points for a surface and deforms the control points when the image is deformed. Thus, the control points are connected and used to draw the surface of the new image.
- the Marionette method defines or labels a surface and does not label a region.
- the organ identification module 135 applies the created probability map to a probabilistic classification to determine the probability that the organ in question (e.g., parotid glands or another healthy organ to be avoided during radiation) is located within the determined region of interest and to delineate the location of the organ within the region of interest.
- the probabilistic classifier looks into the region of interest defined during segmentation and into the probability map defined by the probabilistic atlases to define the organ in question.
- the probabilistic classification is situated Bayesian classification.
- the Bayesian classification described below represents only one exemplary probabilistic classification, and thus other types of probabilistic classification can be used.
- the situated Bayesian classification method uses two types of probabilities to confirm the location of the organ in question - likelihood probability and prior or locational probability.
- the likelihood probability is based on the brightness or intensity of the CT pixel value of the anatomical structure (e.g., a bright structure represents bone and a dark structure represents air).
- the prior or locational probability is defined by the output of the multi-atlas probability map and represents the likelihood that the object is present at a certain location. In other words, the locational probability represents what one would have known without even looking at the CT pixel value.
- the combination of these two probabilities delivers the final result of the Bayesian classification: the probability that the organ is at the location defined by the region of interest.
- the likelihood probability p(d|h) is the probability of CT intensity data d given a tissue class hypothesis h.
- Gaussian forms are assumed for each likelihood and the governing parameters are estimated with an adaptive classifier.
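With Gaussian forms assumed, each tissue class h contributes the standard likelihood below; the class means and variances are what the adaptive classifier estimates.

```latex
p(d \mid h) = \frac{1}{\sqrt{2\pi\sigma_h^{2}}}
\exp\!\left(-\frac{(d-\mu_h)^{2}}{2\sigma_h^{2}}\right),
\qquad h \in \{\text{fat},\ \text{parotid},\ \text{muscle}\}
```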
- the statistics of CT values within the manually delineated parotid glands of the training data are measured, and equations are derived to express the parameters for the parotid tissue class in terms of the discovered parameters of the fat and muscle tissue classes:
- the Bayesian classifier computes the a posteriori probability for each of the three tissue classes (fat, parotid, muscle) within the region of interest and selects the class associated with the maximum a posteriori (MAP) probability, as expressed in the following equation:
- h_MAP = arg max_h p(d|h) * (p_B(h) * b + p_I(h) * (1 - b)), where p_B is the broad (atlas-based) prior, p_I is the immediate prior, and b is the weight that blends the two priors.
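A minimal sketch of that MAP selection over the region of interest, following the reconstructed equation above; the blending weight b and the Gaussian parameters are assumed inputs rather than values from the patent.

```python
import numpy as np

def map_classify(d, mu, sigma, broad_prior, immediate_prior, b=0.5):
    """Pick the MAP tissue class for each voxel in the region of interest.

    d: CT intensities for the ROI voxels (array).
    mu, sigma: per-class Gaussian parameters (0=fat, 1=parotid, 2=muscle).
    broad_prior, immediate_prior: per-class arrays of per-voxel priors
    (the atlas-based and neighboring-slice priors).
    b: blending weight between the two priors; 0.5 is an assumed value.
    """
    posteriors = []
    for h in range(len(mu)):
        likelihood = np.exp(-(d - mu[h]) ** 2 / (2.0 * sigma[h] ** 2)) / (
            np.sqrt(2.0 * np.pi) * sigma[h])
        prior = b * broad_prior[h] + (1.0 - b) * immediate_prior[h]
        posteriors.append(likelihood * prior)
    return np.argmax(np.stack(posteriors), axis=0)  # 0=fat, 1=parotid, 2=muscle
```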
Abstract
A system and method of identifying an anatomical organ in a patient. The method includes acquiring an image of the patient, the image including a plurality of image elements; segmenting the image to identify a region of interest defined by at least one anatomic landmark; generating a probability map based on at least one probabilistic anatomical atlas; and applying the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
Description
SYSTEM AND METHOD FOR IDENTIFYING AN
ANATOMICAL ORGAN IN A PATIENT
RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent Application No.
61/400,162, filed on July 23, 2010, titled AUTOMATIC SEGMENTATION OF THE PAROTID GLANDS BY SITUATED BAYESIAN CLASSIFICATION, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] Image-guided radiation therapy ("IGRT") employs a radiation therapy treatment delivery system to provide radiation therapy to a patient. IGRT uses cross-sectional images of the patient's internal anatomy to better target the radiation dose to the tumor while reducing the radiation exposure to healthy organs. The radiation dose delivered to the tumor is controlled with intensity modulated radiation therapy ("IMRT"), which involves changing the size, shape, and intensity of the radiation beam to conform to the size, shape, and location of the patient's tumor. IGRT and IMRT lead to improved control of the radiation delivered to the tumor while reducing the potential for acute side effects due to irradiation of healthy tissue surrounding the tumor.
SUMMARY OF THE INVENTION
[0003] Radiation therapy typically utilizes x-ray energies in the range of 1-18 megavolts (MV). This is in contrast to imaging a patient for diagnostic purposes. The radiation delivered to a patient for diagnostic images typically utilizes x-ray energies that are less than 1 MV.
[0004] Radiation therapy planning aims to maximize the dose gradient between the target volumes of radiation and organs at risk (OAR), which are healthy organs located near the tumor. This process involves delineating the boundaries of these organs, which is usually accomplished through a blend of manual contouring and editing of automatically generated contours. For example, contouring cancers of the head and neck (H&N) demands a more meticulous attention to detail than does most other regions. The OAR in the H&N exhibit non-convex forms, such that manual drawing requires several hours of tedious work.
[0005] Therefore, radiation therapy planning must attempt to avoid radiation to these healthy organs (e.g., parotid glands, heart, rectum, etc.) while delivering radiation to the tumor. For example, with the exception of tumors positioned low in the neck, parotid glands are a prominent pair of OAR that appear in roughly 95% of H&N contoured treatment plans. Each parotid gland consists of a superficial lobe and a deep lobe connected by a narrow isthmus, yielding an asymmetric shape. The parotid glands are responsible for producing saliva, and excessive radiation of the parotid glands can cause xerostomia and other conditions that may adversely affect a patient's quality of life.
[0006] Sometimes it is very difficult to exactly identify healthy organs during the radiation treatment process. For example, when imaged by computed tomography (CT), the parotid tissue appears between fat and muscle in intensity, but the range is much broader, partially overlapping both fat and muscle. Some reasons for this are that the parotid glands gain more fat with age, and they are vascularized so that tiny bright vessel spots appear inside as the external carotid artery runs through, and bifurcates within, the interior of the parotid gland. Therefore, it is important that such organs are identified during the radiation therapy planning process.
[0007] The invention relates to a system and method for identifying an anatomical organ in a patient. In one embodiment, the invention provides a method of identifying an anatomical organ in a patient. The method includes acquiring an image of the patient, the image including a plurality of image elements; segmenting the image to identify a region of interest defined by at least one anatomic landmark; generating a probability map based on at least one probabilistic anatomical atlas; and applying the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
[0008] In another embodiment, the invention provides a computer program embodied by a computer readable medium capable of being executed by a computer, the computer program for use in a radiation therapy treatment system. The computer program includes an image generation module operable to acquire an image of a patient, the image including a plurality of image elements; a segmentation module operable to segment the image to identify a region of interest defined by at least one anatomic landmark; a probability map generation module operable to generate a probability map based on at least one probabilistic anatomical atlas; and
an organ identification module operable to apply the probability map to a probabilistic classification to determine a probability that an organ is located in the region of interest.
[0009] In still another embodiment, the invention provides a radiation therapy treatment system for identifying an anatomic organ in a patient. The radiation therapy treatment system includes an image acquisition device operable to acquire an image of the patient, the image including a plurality of image elements; a processor; and a computer readable medium storing non-transitory programmed instructions. When executed by the processor, the non-transitory programmed instructions segment the image to identify a region of interest defined by at least one anatomic landmark, generate a probability map based on at least one probabilistic anatomical atlas, and apply the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
[0010] In yet another embodiment, the invention provides a radiation therapy treatment system for identifying an anatomic organ in a patient. The radiation therapy treatment system includes a treatment delivery system and a treatment planning system. The treatment delivery system includes a gantry, a radiation source supported by the gantry, a detector supported by the gantry and positioned opposite the radiation source, and a couch to support a patient. The treatment planning system includes a computer processor configured to receive instructions from at least one of an image module, a treatment plan module, a contour module, a patient positioning module, a deformation module, a segmentation module, a probability map generation module, and an organ identification module. These modules, in conjunction with the computer processor, operate to process data, manipulate data, analyze data, and format data for display. In particular, these modules generate and provide instructions to the processor to segment the image to identify a region of interest defined by at least one anatomic landmark, generate a probability map based on at least one probabilistic anatomical atlas, and apply the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
[0011] Other aspects of the invention will become apparent by consideration of the detailed description and accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
[0013] FIG. 1 is a schematic diagram of a radiation therapy treatment system.
[0014] FIG. 2 is a perspective view of the radiation therapy treatment system.
[0015] FIG. 3 is a perspective view of a multi-leaf collimator that can be used in the radiation therapy treatment system illustrated in FIG. 2.
[0016] FIG. 4 is a schematic illustration of the radiation therapy treatment system of FIG. 2.
[0017] FIG. 5 is a schematic diagram of a software program used in the radiation therapy treatment system.
[0018] FIG. 6 is a flow chart of a method of identifying an anatomical organ in a patient according to an embodiment of the invention.
[0019] FIG. 7 illustrates an image segmented into couch, body, and background (left), air, fat, muscle, bone, and skin tissues (center), and different organs (right).
[0020] FIG. 8 is a schematic illustration of the hierarchical steps of a segmentation process embodying the invention.
[0021] FIG. 9 shows three slices of a probability map computed from probabilistic anatomical atlases representing training data.
[0022] FIG. 10 illustrates an image with manual contours overlaid (left), the atlas contours overlaid prior to deformation (center), and after deforming by a Marionette method (right).
DETAILED DESCRIPTION
[0023] Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms "mounted," "connected," "supported," and "coupled" and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings.
[0024] Although directional references, such as upper, lower, downward, upward, rearward, bottom, front, rear, etc., may be made herein in describing the drawings, these references are made relative to the drawings (as normally viewed) for convenience. These directions are not intended to be taken literally or limit the present invention in any form. In addition, terms such as "first," "second," and "third" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance.
[0025] In addition, it should be understood that embodiments of the invention may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art, and based on a reading of this detailed description, would recognize that, in at least one embodiment, the electronic based aspects of the invention may be implemented in software (e.g., stored on non-transitory computer-readable medium). As such, it should be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components may be utilized to implement the invention. Furthermore, and as described in subsequent paragraphs, the specific mechanical configurations illustrated in the drawings are intended to exemplify embodiments of the invention and that other alternative mechanical configurations are possible.
[0026] FIG. 1 illustrates a radiation therapy treatment system 10 that provides radiation therapy to a patient 14 according to one embodiment of the present invention. The radiation therapy treatment system 10 includes a treatment delivery system 11 and a treatment planning system 12. The radiation therapy treatment can include photon-based radiation therapy, brachytherapy, electron beam therapy, proton, neutron, particle therapy, or other types of treatment therapy.
[0027] With reference to FIGS. 1-2, the treatment delivery system 11 includes a gantry 18. The gantry 18 supports a radiation module 22, which includes a radiation source 24 and a LINAC 26 that generates a beam 30 of radiation. Although the gantry 18 shown in FIG. 2 is a ring gantry (i.e., it extends through a full 360° arc to create a complete ring or circle), other types of mounting arrangements may also be employed. For example, a C-type, partial ring gantry, or robotic arm gantry arrangement could be used. Any other framework capable of positioning the radiation module 22 at various rotational and/or axial positions relative to the patient 14 may also be employed. In addition, the radiation source 24 may travel in a path that does not follow the shape of the gantry 18. For example, the radiation source 24 may travel in a non-circular path even though the illustrated gantry 18 is generally circular-shaped. The gantry 18 of the illustrated embodiment defines a gantry aperture 32 into which the patient 14 moves during treatment.
[0028] The radiation module 22 also includes a modulation device 34 operable to modify or modulate the radiation beam 30. The modulation device 34 modulates the radiation beam 30 and directs the radiation beam 30 toward the patient 14. Specifically, the radiation beam 30 is directed toward a portion 38 of the patient 14. The portion 38 may include the patient's entire body, but is generally smaller than the patient's entire body and can be defined by a two-dimensional area and/or a three-dimensional volume. A portion may include one or more regions of interest. For example, a portion desired to receive the radiation, which may be referred to as a target 38 or target region, is an example of a region of interest. Another type of region of interest is a region at risk. If a portion includes a region at risk, the radiation beam 30 is preferably diverted from the region at risk. The patient 14 may also have more than one target region that needs to receive radiation therapy. Such modulation is sometimes referred to as intensity modulated radiation therapy ("IMRT").
[0029] The modulation device 34 can include a collimation device 42 as illustrated in FIG. 2. The collimation device 42 includes a set of jaws 46 that define and adjust the size of an aperture 50 through which the radiation beam 30 may pass. The jaws 46 include an upper jaw 54 and a lower jaw 58. The upper jaw 54 and the lower jaw 58 are moveable to adjust the size of the aperture 50.
[0030] In one embodiment, and illustrated in FIG. 3, the modulation device 34 can comprise a multi-leaf collimator 62, which includes a plurality of interlaced leaves 66 operable to move from a first position to a second position, to provide intensity modulation of the radiation beam 30. It is also noted that the leaves 66 can move to a position anywhere between a minimally and maximally-open position. The plurality of interlaced leaves 66 modulate the strength, size, and shape of the radiation beam 30 before the radiation beam 30 reaches the target 38 on the patient 14. Each of the leaves 66 is independently controlled by an actuator 70, such as a motor or an air valve so that the leaf 66 can open and close quickly to permit or block the passage of radiation. The actuators 70 can be controlled by a computer 74 and/or controller.
[0031] With further reference to FIG. 1, the treatment delivery system 11 can also include a detector 78, e.g., a kilovoltage or a megavoltage detector, that receives the radiation beam 30. The linear accelerator 26 and the detector 78 can also operate as a computed tomography (CT) system to generate CT images of the patient 14. The linear accelerator 26 emits the radiation beam 30 toward the target 38 in the patient 14. The target 38 absorbs some of the radiation. The detector 78 detects or measures the amount of radiation absorbed by the target 38. The detector 78 collects the absorption data from different angles as the linear accelerator 26 rotates around and emits radiation toward the patient 14. The collected absorption data is transmitted to the computer 74 for processing of the absorption data and generating images of the patient's body tissues and organs. The images can also illustrate bone, soft tissues, and blood vessels.
[0032] The CT images can be acquired with a radiation beam 30 that has a fan-shaped geometry, a multi-slice geometry, or a cone-beam geometry. In addition, the CT images can be acquired with the linear accelerator 26 delivering megavoltage energies or kilovoltage energies. It is also noted that the acquired CT images can be registered with previously acquired CT images (from the radiation therapy treatment system 10 or other image acquisition devices, such
as other CT scanners, MRI systems, and PET systems). For example, the previously acquired CT images for the patient 14 can include identified targets 38 made through a contouring process. The newly acquired CT images for the patient 14 can be registered with the previously acquired CT images to assist in identifying the targets 38 in the new CT images. The registration process can use rigid or deformable registration tools.
[0033] The image data can be presented on a display as either a three-dimensional image or a series of two-dimensional images. In addition, the image data comprising the images can be either voxels (for three-dimensional images) or pixels (for two-dimensional images). The term image element is used generally in the description to refer to both.
[0034] The treatment delivery system 11 also includes a patient support device, such as a couch 82 (illustrated in FIG. 1), which supports the patient 14. The couch 82, or at least portions thereof, moves into and out of the field of radiation along an axis 84. The couch 82 is also capable of moving along the X and Z axes as illustrated in FIG. 1. In other embodiments of the invention, the patient support can be a device that is adapted to support any portion of the patient's body. The patient support is not limited to having to support the entire patient's body. The system 11 also can include a drive system 86 operable to manipulate the position of the couch 82. The drive system 86 can be controlled by the computer 74.
[0035] The treatment planning system 12 includes the computer 74, which is embodied as an operator station to be accessed by medical personnel. The computer 74 includes a controller 75, a user interface module 76, a display 77, and a communications module 79. The controller 75 and the user interface module 76 include combinations of software and hardware that are operable to, among other things, control the operation of the treatment delivery system 11 and the information that is presented on the display 77.
[0036] The controller 75 includes, for example, a processing unit 80 (e.g., a microprocessor, a microcontroller, or another suitable programmable device), a memory 81, and a bus 83. The bus 83 connects various components of the controller 75, including the memory 81, to the processing unit 80. The processing unit 80 may represent one or more general-purpose processors, a special purpose processor such as a digital signal processor or other type of device such as a controller or field programmable gate array.
[0037] It should be understood that although the controller 75, the user interface module 76, the display 77, and the communications module 79 are illustrated as part of a single server or computing device, the components of the treatment planning system 12 can be distributed over multiple servers or computing devices. Similarly, the treatment planning system 12 can include multiple controllers 75, user interface modules 76, displays 77, and communications modules 79.
[0038] The memory 81 includes, for example, a read-only memory ("ROM"), a random access memory ("RAM"), an electrically erasable programmable read-only memory
("EEPROM"), a flash memory, a hard disk, an SD card, or another suitable magnetic, optical, physical, or electronic memory device. The processing unit 80 is connected to the memory 81 and executes software program 90 that is capable of being stored in the RAM (e.g., during execution), the ROM (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a disc. Additionally or alternatively, the memory 81 is included in the processing unit 80. The controller 75 also includes an input/output ("I/O") system 85 that includes routines for transferring information between components within the controller 75 and other components of the treatment planning system 12. Software included in the implementation of the treatment planning system 12 is stored in the memory 81 of the controller75. The software includes, for example, firmware, one or more applications, program data, one or more program modules, and other executable instructions. The controller 75 is configured to retrieve from memory and execute, among other things, instructions related to the methods described below.
[0039] The user interface module 76 is configured for user control of the treatment planning system 12. For example, the user interface module 76 is operably coupled to the controller 75 to control the information presented on the display 77. The user interface module 76 can include a combination of digital and analog input or output devices required to achieve a desired level of control for the treatment planning system 12. For example, the user interface module 76 can include input devices such as a touch-screen display, a plurality of knobs, a plurality of dials, a plurality of switches, a plurality of buttons, or the like.
[0040] The display 77 is, for example, a liquid crystal display ("LCD"), a light-emitting diode ("LED") display, an organic LED ("OLED") display, an electroluminescent display
("ELD"), a surface-conduction electron-emitter display ("SED"), a field emission display ("FED"), a thin-film transistor ("TFT") LCD, or the like. In other constructions, the display 77 is a Super active-matrix OLED ("AMOLED") display.
[0041] In some implementations, the treatment planning system 12 is also configured to connect to a network (e.g., a WAN, a LAN, or the like) via the communications module 79 to access other programs, software, or treatment planning systems 12, or treatment delivery systems 11. The communications module 79 can include a network interface, such as an Ethernet card or a wireless network card, that allows the treatment planning system 12 to send and receive information over a network, such as a local area network or the Internet. In some embodiments, the communications module 79 includes drivers configured to receive and send data to and from various input and/or output devices, such as a keyboard, a mouse, a printer, etc. Data
communications can occur via a wireless local area network ("LAN") using any of a variety of communications protocols, such as Wi-Fi, Bluetooth, ZigBee, or the like. Additionally or alternatively, data communications can occur over a wide area network ("WAN") (e.g., a TCP/IP based network or the like).
[0042] The communications module 79 is also compatible with the Digital Imaging and Communications in Medicine (DICOM) protocol, in any version, and/or with other required protocols. DICOM is an international communications standard developed by NEMA that defines the format used to transfer medical image-related data between different pieces of medical equipment. DICOM RT refers to the standards that are specific to radiation therapy data.
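By way of illustration only, the short Python sketch below reads a single DICOM file with the open-source pydicom library and rescales the stored pixel values to Hounsfield units; the file name is hypothetical, and pydicom is not part of the disclosed system.

```python
import pydicom

# Read one CT slice stored as a DICOM file (the file name is hypothetical).
ds = pydicom.dcmread("ct_slice_001.dcm")

print(ds.Modality)    # e.g., "CT"
print(ds.PatientID)

# Convert stored pixel values to Hounsfield units with the rescale tags.
hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)
print(hu.shape)
```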
[0043] The two-way arrows in FIGS. 1 and 4 generally represent two-way communication and information transfer where indicated. However, for some medical and computerized equipment, only one-way communication and information transfer may be necessary.
[0044] The processing unit 80 executes instructions stored in the computer-readable media. The instructions can include various components or modules configured to perform particular functionality when executed by the processing unit 80. For example, the computer-readable media includes a treatment planning process application that interacts with the user interface module 76 to display on the display 77 various "screens" or "pages" related to the patient's
treatment plan. As such, all of the screens of the user interface are not limited to the arrangement shown in any of the drawings. The screens may include, but are not limited to, fields, columns, rows, dialog boxes, tabs, buttons, radio buttons, and drop-down menus. Field titles may vary and are not limited to those shown in the drawings.
[0045] The treatment planning system 12 can represent a server that hosts the treatment planning process application as a network-based tool or application. Therefore, a user can access the treatment planning process application through a network, such as the Internet. Accordingly, in some embodiments, a user is not required to have the treatment planning process application permanently installed on the computer 74. Rather, the user can access the treatment planning process application using a browser application, such as Internet Explorer®.
[0046] The software program 90 includes a plurality of modules that interact or communicate with one another to provide instructions to the processing unit for generating a treatment plan for a patient, modifying or adapting a treatment plan, acquiring images of the patient, and controlling the components of the treatment delivery system 11. The software program 90 includes an image module 102 operable to acquire or receive images of at least a portion of the patient 14. The image module 102 can generate instructions for the on-board or on-line image device, such as a CT imaging device, to acquire images of the patient 14 before treatment commences, during treatment, and after treatment according to desired protocols. For example, the on-board or on-line image device can comprise the radiation source and the detector, where the radiation source delivers kV or MV radiation to the patient that is collected by the detector and processed into a three-dimensional (e.g., CT) image. For CT images, the data comprising the patient images are composed of image elements, which are stored as data in the radiation therapy treatment system. These image elements may be any data construct used to represent image data, including two-dimensional pixels or three-dimensional voxels. The images can be stored in memory or in a database and retrieved by the image module 102 for later use.
[0047] In one aspect, the image module 102 acquires an image of the patient 14 while the patient 14 is substantially in a treatment position. Other off-line imaging devices or systems may be used to acquire pre-treatment images (e.g., three-dimensional) of the patient 14, such as non-quantitative CT, MRI, PET, SPECT, ultrasound, transmission imaging, fluoroscopy, RF-based localization, and the like. The acquired images can be used for registration/alignment of the patient 14 with respect to the gantry or other point and/or to determine or predict a radiation dose to be delivered to the patient 14. The acquired images also can be used to generate a
deformation map to identify the differences between one or more of the planning images and one or more of the pre-treatment (e.g., a daily image), during-treatment, or after-treatment images. The acquired images also can be used to determine a radiation dose that the patient 14 received during the prior treatments. The image module 102 also is operable to acquire images of at least a portion of the patient 14 while the patient is receiving treatment to determine a radiation dose that the patient 14 is receiving in real-time.
[0048] The software program 90 also includes a treatment plan module 106 including instructions for generating a treatment plan for the patient 14 based on data input to the system 10 by medical personnel, retrieved from memory or a database, received from other modules, or otherwise acquired by the system 10. The data can include one or more images (e.g., planning images and/or pre-treatment images) of at least a portion of the patient 14. These images may be acquired and processed by the image module 102 in the manner described in the preceding paragraphs. The treatment plan module 106 can separate the treatment plan into a plurality of treatment fractions and can determine the radiation dose for each fraction or treatment based on a radiation dose prescription input by medical personnel. The treatment plan module 106 can communicate with a contour module 115, which includes instructions for generating various contours to be drawn around the target 38. Medical personnel can utilize the contour module 115 via the user interface module 76 to contour or identify particular areas on and/or around the target 38 and to specify the amount of radiation dose for each of the areas.
[0049] During the treatment planning process or between one or more of the treatment sessions (e.g., delivery of a fraction), medical personnel can utilize one or more of the images to generate one or more contours on the one or more images to identify one or more treatment regions or avoidance regions of the target 38. The contour process can include using geometric shapes, including three-dimensional shapes to define the boundaries of the treatment region of the target 38 that will receive radiation and/or the avoidance region of the target 38 that will receive minimal or no radiation. The medical personnel can use a plurality of predefined
geometric shapes to define the treatment region(s) and/or the avoidance region(s). The plurality of shapes can be used in a piecewise fashion to define irregular boundaries.
[0050] After the treatment plan is established (but it is not static and can change throughout the course of treatment), the patient 14 returns to the medical facility to receive the radiation dose prescribed for each fraction. Prior to delivery of each fraction, the patient is positioned on the couch 82 and registered or aligned with respect to the treatment delivery system 11.
[0051] If the patient's position needs to be adjusted, the patient positioning module 110 provides instructions to the drive system 86 to move the couch 82 or the patient 14 can be manually moved to the new position. In one construction, the patient positioning module 110 can receive data from lasers positioned in the treatment room to provide patient position data with respect to the isocenter of the gantry 18. Based on the data from the lasers, the patient positioning module 110 provides instructions to the drive system 86, which moves the couch 82 to achieve proper alignment of the patient 14 with respect to the gantry 18. It is noted that devices and systems, other than lasers, can be used to provide data to the patient positioning module 110 to assist in the alignment process.
[0052] After the patient is properly positioned, a daily pre-treatment image (e.g., a 3D or volumetric image, sometimes referred to as a fraction image or daily image) is acquired while the patient remains in substantially a treatment position. The daily image can be compared to previously acquired images of the patient to identify any changes in the target 38 or other anatomical structures over the course of treatment (e.g., from previously-delivered fractions). The changes in the target 38 or other structures are sometimes referred to as deformation.
Deformation may require that the original treatment plan be modified to account for the deformation. Instead of having to recontour the target 38 or the other structures, the contour module 115 can automatically apply and conform the preexisting contours to take into account the deformation. To do this, a deformation algorithm (discussed below) identifies the changes to the target 38 or other structures. These identified changes are input to the contour module 115, which then modifies the contours based on those changes.
[0053] Therefore, the software program 90 can also include a deformation module 118 including instructions to deform an image(s) while improving the anatomical significance of the
results. The deformation of the image(s) can be used to generate a deformation map to identify the differences between one or more of the planning images and one or more of the daily images.
[0054] The deformed image(s) also can be used for registration of the patient 14 and/or to determine or predict a radiation dose to be delivered to the patient 14. The deformed image(s) also can be used to determine a radiation dose that the patient 14 received during the prior treatments or fractions. The image module 102 is also operable to acquire one or more images of at least a portion of the patient 14 while the patient is receiving radiation treatment that can be deformed to determine a radiation dose that the patient 14 is receiving in real-time.
[0055] The software program 90 also includes a segmentation module 126 for effecting segmentation of the images acquired by the image module 102. Segmentation (discussed below in more detail) is the process of assigning a label to each voxel or at least some of the voxels in one of the images. The label represents the type of tissue present within the voxel. The segmentation is stored as an image (array of voxels). In one embodiment, the segmentation module 126 is operable to segment an image to identify a region of interest defined by at least one anatomic landmark. The segmentation module 126 may be a stand-alone software module or may be integrated with any of the software modules.
[0056] In one embodiment, the software program 90 can also include a probability map generation module 130, which includes instructions for generating a probability map based on one or more probabilistic anatomical atlases. As discussed in more detail below, the probability map generation module 130 uses segmented images of training data to generate probabilistic anatomical atlases. For example, these images of training data are obtained from patients representing a class of the population with tumors located in areas near healthy organs (e.g., the head and neck area, prostate area, etc.). In some embodiments, these healthy organs are essential to the human body (e.g., parotid glands, heart, rectum, etc.).
[0057] The software program 90 can also include an organ identification module 135. The organ identification module 135 includes instructions for determining a probability that an organ (e.g., parotid glands or another healthy organ to be avoided during radiation) is located in a region of interest identified by the segmentation module 126. For example, as described in greater detail in the following paragraphs, the organ identification module 135 applies the
probability map created by the probability map generation module 130 to a probabilistic classification to determine the probability that the organ in question is located within the determined region of interest.
[0058] FIG. 6 is a flow chart of a method 150 of identifying an anatomical organ in a patient according to an embodiment of the invention. The first step in the method is acquiring an image of the patient (at 151). In some embodiments, the image (e.g., a three-dimensional image) is acquired by the image module 102. The image includes a plurality of image elements (e.g., voxels). The next step includes segmenting the image to identify a region of interest defined by at least one anatomic landmark. This is performed by the segmentation module 126 (at 152). The next step involves generating a probability map based on at least one probabilistic anatomical atlas and is performed by the probability map generation module 130 (at 153). The method continues by applying the probability map to a probabilistic classification (e.g., situated Bayesian classification) at 154. The next step is determining a probability that the organ is located in the identified region of interest. Generally, this is performed by the organ
identification module 135 (at 155). In some embodiments, the organ identification module further delineates the location of the organ within the region of interest. The following paragraphs describe each of the steps in the method 150 in more detail.
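The following Python sketch is a minimal, hypothetical rendering of the flow of method 150, assuming the ROI mask, probability map, and Gaussian likelihood parameters have already been computed by the preceding steps (sketches for those steps follow the corresponding paragraphs below); it is illustrative only and not the claimed implementation.

```python
import numpy as np

def method_150(image, roi_mask, prob_map, mean, sd):
    """Flow of steps 152-155 on precomputed inputs (illustrative only)."""
    likelihood = np.exp(-0.5 * ((image - mean) / sd) ** 2)  # p(d|h)
    posterior = likelihood * prob_map                       # apply the map
    posterior[~roi_mask] = 0.0   # the organ is sought only inside the ROI
    return posterior

rng = np.random.default_rng(1)
ct = rng.normal(45.0, 20.0, size=(4, 32, 32))   # synthetic stand-in "CT"
roi = np.ones(ct.shape, dtype=bool)
prior = np.full(ct.shape, 0.5)
print(method_150(ct, roi, prior, mean=45.0, sd=15.0).max())
```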
[0059] As mentioned above, the segmentation module 126 is configured to segment the images acquired by the image module 102. For CT images, the data comprising the patient images are composed of image elements, which are stored as data in the radiation therapy treatment system. These image elements may be any data construct used to represent image data, including two-dimensional pixels or three-dimensional voxels. In order to accurately analyze the patient images, the voxels are subjected to the segmentation process. In particular, segmentation categorizes each element as being one of four different substances in the human body. These four substances or tissue types are air, fat, muscle, and bone. FIG. 7 illustrates a segmentation image through the different steps in the segmentation process.
[0060] The segmentation module 126 can apply a five-layer hierarchy (FIG. 8) of segmentation steps that first analyzes each image element individually (the image element or voxel layer 128), then analyzes neighborhoods or groups of image elements collectively (the
neighborhood layer 132), organizes them into tissue groups (the tissue layer 136), then organs (the organ layer 140), and finally organ systems (the systems layer 144). During that process, the segmentation module 126 computes or defines a region of interest based on the provided image. As shown in FIG. 7, the image can be initially segmented into couch, body, and background (left image on FIG. 7). Then, the image is segmented into five groups of tissue - air, fat, muscle, bone, and skin (center image). The last image of FIG. 7 shows segmentation into different organs (parotid glands, cranium, brainstem, C1, dens, mandible, pharynx, and teeth) with a region of interest for the parotid glands.
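As a toy illustration of the voxel layer alone, the sketch below labels a volume by fixed Hounsfield-unit cut points; the thresholds are assumptions chosen for illustration, and the patented hierarchy refines such labels through the four higher layers.

```python
import numpy as np

def label_tissues(hu):
    """Voxel-layer pass: label air/fat/muscle/bone by intensity alone.

    The Hounsfield-unit cut points are illustrative assumptions; the
    patented hierarchy refines these labels in four further layers.
    """
    labels = np.zeros(hu.shape, dtype=np.uint8)  # 0 = air
    labels[(hu > -200) & (hu <= -30)] = 1        # 1 = fat
    labels[(hu > -30) & (hu <= 120)] = 2         # 2 = muscle
    labels[hu > 120] = 3                         # 3 = bone
    return labels

hu = np.array([[[-1000.0, -80.0], [50.0, 700.0]]])
print(label_tissues(hu))   # [[[0 1] [2 3]]]
```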
[0061] The five-layer hierarchy of steps combines rule-based, atlas-based, and mesh-based approaches to segmentation in order to achieve both recognition and delineation of anatomical structures, thereby defining the complete image as well as the details within the image.
Additional information regarding segmentation can be found in co-pending U.S. Patent
Application No. 12/380,829, the contents of which are incorporated herein by reference. It is to be understood that the segmentation process described above and illustrated in FIG. 8 represents only one exemplary process of segmenting an image, and thus other types of segmentation are possible.
[0062] During the segmentation process, the region of interest is computed relative to anatomic landmarks. For example, the parotid glands are expected to reside in the space bordered superiorly by the middle ear and zygomatic arch, and inferiorly by the bottom reaches of the mandible body. The medial boundaries are carved by the styloid process of the temporal bone, while the lateral boundary is the skin. The anterior limits are established by the fact that each parotid gland lies along the masseter and wraps around the posterior tip of the mandible ramus. The posterior boundary is buttressed by the sternocleidomastoid and the mastoid process.
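A minimal sketch of turning such landmark positions into a rectangular region of interest follows; the landmark names and coordinates are hypothetical placeholders, and a clinical implementation would derive them from the segmentation just described.

```python
import numpy as np

def parotid_roi(landmarks, shape):
    """Box bounded by landmark voxel indices (all names hypothetical)."""
    z0 = landmarks["zygomatic_arch"][0]     # superior limit
    z1 = landmarks["mandible_bottom"][0]    # inferior limit
    y0 = landmarks["masseter"][1]           # anterior limit
    y1 = landmarks["mastoid_process"][1]    # posterior limit
    x0 = landmarks["styloid_process"][2]    # medial limit
    x1 = landmarks["skin_lateral"][2]       # lateral limit
    roi = np.zeros(shape, dtype=bool)
    roi[z0:z1, y0:y1, x0:x1] = True
    return roi

roi = parotid_roi(
    {"zygomatic_arch": (10, 0, 0), "mandible_bottom": (40, 0, 0),
     "masseter": (0, 20, 0), "mastoid_process": (0, 60, 0),
     "styloid_process": (0, 0, 30), "skin_lateral": (0, 0, 90)},
    shape=(64, 96, 128))
print(roi.sum())   # 72000 voxels inside the illustrative box
```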
[0063] After segmentation of the patient's image is completed and the region of interest is defined, the probability map generation module 130 uses the region of interest to generate an atlas-based probability map based on at least one probabilistic anatomical atlas. For example, a probabilistic atlas can be computed from a training set of CT scans (e.g., 10 scans) obtained from patients representing a class of the population with tumors located in areas near vital healthy organs (e.g., the head and neck area, prostate, etc.). The system 10 deforms each of the
probabilistic anatomical atlases to the incoming scan (i.e., to the image obtained by the image module 102) to create the atlas-based probability map (see FIG. 9). For example, these ten deformed atlases can be combined into a single probability map by using the STAPLE (Simultaneous Truth and Performance Level Estimation) method. In other embodiments, alternative methods for combining the deformed atlases can be used. Consequently, the probability map is based on the acquired patient image and is different for every person. As will be described in more detail below, the map is then restricted by the region of interest and applied to a probabilistic classification to determine a probability that an organ (e.g., parotid glands) is located in the identified region of interest.
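The sketch below illustrates one simple way to combine deformed binary atlas masks into a probability map, using uniform voxel-wise averaging followed by smoothing; the STAPLE weighting named above would replace the plain average, and the smoothing width is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def atlas_probability_map(deformed_masks, sigma=2.0):
    """Average deformed binary atlas masks, then smooth.

    A uniform average is shown; STAPLE, as named in the text, would
    instead weight each atlas by its estimated performance.
    """
    stacked = np.stack([np.asarray(m, dtype=np.float32) for m in deformed_masks])
    return gaussian_filter(stacked.mean(axis=0), sigma)

masks = [np.zeros((8, 32, 32)), np.ones((8, 32, 32))]
print(atlas_probability_map(masks).max())   # 0.5 everywhere
```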
[0064] In some embodiments, this atlas-based probability map is referred to as a broad prior. Another type of prior, the immediate prior, is computed by the system during the segmentation of each scan. For example, the immediate prior is formed by smoothing and normalizing the binary mask segmentation of the parotid gland on the immediately neighboring slice. The first slice segmented is the slice on which C1 (the first cervical vertebra) has the largest span, which is the slice depicted in FIG. 7. It is known that the parotid glands can be expected to have a large, if not the largest, cross-sectional area on this particular slice. From that point, segmentation propagates in both directions, inferior and superior, always using the previously visited slice to compute the immediate prior.
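A minimal sketch of forming such an immediate prior from the neighboring slice's binary mask, assuming a Gaussian smoothing width chosen for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def immediate_prior(prev_slice_mask, sigma=3.0):
    """Smooth the neighboring slice's binary parotid mask, then normalize."""
    smoothed = gaussian_filter(prev_slice_mask.astype(np.float32), sigma)
    peak = smoothed.max()
    return smoothed / peak if peak > 0 else smoothed

mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:40, 25:45] = 1
print(immediate_prior(mask).max())   # 1.0 after normalization
```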
[0065] In one embodiment, the deformation module 118 deforms the probabilistic atlases before applying the probability map to probabilistic classification. It is to be understood that the deformation processes described below represent only exemplary processes of deforming an image, and thus other types of deformations are possible. In one embodiment, deformation includes the Insight Toolkit's (ITK) implementation of affine registration, followed by ITK's demons algorithm for free-form deformation. The output of the deformation process is a 3D warp field for each case. If the warp field were applied directly to a binary mask, such as the manually delineated parotid glands included in the training set, the warped mask would suffer from fragmentation and holes. Therefore, the warp field is applied not to the masks, but instead to 3D meshes generated from each mask by using the method of marching cubes. After the mesh vertices are displaced, the masks are regenerated using ITK's rasterization filter. The resultant masks are accumulated and normalized to form the probability map shown in FIG. 9. Since there
are only 10 discrete levels owing to the size of the training set, the map is smoothed by convolution with a 3D Gaussian kernel. In another embodiment, in order to apply the atlas to the test data (i.e., the patient image), the average of the 10 deformed training scans is deformed, again using ITK, to each of the test scans. The resultant warp field is also applied to the probability map.
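The sketch below imitates the mesh-based warping idea with scikit-image's marching cubes: surface vertices are displaced by a dense warp field and marked back into a volume. Marking vertex voxels only approximates ITK's mesh rasterization filter named above, so this is a structural illustration rather than the described implementation.

```python
import numpy as np
from skimage.measure import marching_cubes

def warp_mask_via_mesh(mask, warp):
    """Displace a binary mask's surface mesh by a dense warp field.

    warp has shape (3,) + mask.shape and holds per-voxel (dz, dy, dx)
    displacements. Marking displaced vertex voxels stands in for the
    proper mesh rasterization described in the text.
    """
    verts, _, _, _ = marching_cubes(mask.astype(np.float32), level=0.5)
    bounds = np.array(mask.shape) - 1
    idx = np.clip(np.round(verts).astype(int), 0, bounds)
    disp = warp[:, idx[:, 0], idx[:, 1], idx[:, 2]].T  # sample the field
    moved = np.clip(np.round(verts + disp).astype(int), 0, bounds)
    out = np.zeros_like(mask)
    out[moved[:, 0], moved[:, 1], moved[:, 2]] = 1
    return out

mask = np.zeros((16, 16, 16), dtype=np.uint8)
mask[4:12, 4:12, 4:12] = 1
warp = np.full((3, 16, 16, 16), 1.5)   # uniform shift, for illustration
print(warp_mask_via_mesh(mask, warp).sum())
```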
[0066] In other embodiments, the system 10 uses a Marionette method of image deformation to deform the probabilistic atlases to the incoming scan. FIG. 10 illustrates an image with manual contours overlaid (left), the atlas contours overlaid prior to deformation (center), and after deformation by the Marionette method (right). Deformation is used to form the atlas-based probability map by applying the probabilistic atlases to the current scan (i.e., image) of the patient. A shape of an object (i.e., organ) can be defined by several thousand control points or by a very small number of control points (e.g., 10-15). Instead of allowing each image voxel in the atlas to move in any direction, the Marionette method permits only a few anatomical motions. The reference image (i.e., the atlas) and the patient image are both segmented automatically. These segmentations are then analyzed to divine the values of the few parameters that govern the allowable motions. Given these model parameters, a deformation field is generated directly without iteration. This field is then passed into a pure free-form deformation process in order to account for any motion not captured by the model. Analogous to strings on a marionette, this approach manipulates parameters that tilt, swivel, and nod the cranium, swing the mandible, and shrink/expand fatty tissue to account for weight loss. Further, this method allows the cranium and mandible to scale in size.
[0067] The Marionette method manipulates a shape model associated with some of the organs in the human body. Instead of labeling the voxels in the atlas, the method positions the control points for a surface and deforms the control points when the image is deformed. Thus, the control points are connected and used to draw the surface of the new image. In general terms, the Marionette method defines or labels a surface and does not label a region.
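Because the Marionette model's actual parameters are described here only qualitatively, the sketch below substitutes a hypothetical two-parameter stand-in (a global scale and an in-plane tilt) to show the key property: a handful of parameters generate a dense deformation field directly, without iterative optimization.

```python
import numpy as np

def marionette_like_field(shape, scale=1.02, tilt_deg=2.0):
    """Two hypothetical parameters -> dense field, with no iteration.

    A global scale and an in-plane tilt about the volume center stand in
    for the model's anatomical parameters, which are not public here.
    """
    zz, yy, xx = np.meshgrid(*[np.arange(s, dtype=np.float32) for s in shape],
                             indexing="ij")
    cz, cy, cx = [(s - 1) / 2.0 for s in shape]
    t = np.deg2rad(tilt_deg)
    # Scale about the center, then rotate in the y-x plane about z.
    y0, x0 = (yy - cy) * scale, (xx - cx) * scale
    y1 = np.cos(t) * y0 - np.sin(t) * x0
    x1 = np.sin(t) * y0 + np.cos(t) * x0
    dz = ((zz - cz) * scale) - (zz - cz)
    return np.stack([dz, y1 - (yy - cy), x1 - (xx - cx)])

field = marionette_like_field((8, 32, 32))
print(field.shape)   # (3, 8, 32, 32); usable with the mesh sketch above
```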
[0068] Next, the organ identification module 135 applies the created probability map to a probabilistic classification to determine the probability that the organ in question (e.g., parotid glands or another healthy organ to be avoided during radiation) is located within the determined
region of interest and to delineate the location of the organ within the region of interest. In other words, the probabilistic classifier looks into the region of interest defined during segmentation and into the probability map defined by the probabilistic atlases to define the organ in question. In one embodiment, the probabilistic classification is situated Bayesian classification. However, it is to be understood that the Bayesian classification described below represents only one exemplary probabilistic classification, and thus other types of probabilistic classification can be used.
[0069] In one embodiment, the situated Bayesian classification method uses two types of probabilities to confirm the location of the organ in question - a likelihood probability and a prior, or locational, probability. The likelihood probability is based on the brightness or intensity of the CT pixel value of the anatomical structure (e.g., a bright structure represents bone and a dark structure represents air). The prior or locational probability is defined by the output of the multi-atlas probability map and represents the likelihood that the object is present at a certain location. In other words, the locational probability represents what one would have known without even looking at the CT pixel value. The combination of these two probabilities delivers the final result of the Bayesian classification - the probability that the organ is at the location defined by the region of interest.
[0070] For example, the likelihood probability p(d|h) is the probability of CT intensity data d given a tissue class hypothesis h. In one embodiment, Gaussian forms are assumed for each likelihood, and the governing parameters are estimated with an adaptive classifier. The statistics of CT values within the manually delineated parotid glands of the training data are measured, and the following equations are derived to express the parameters for the parotid tissue class in terms of the discovered parameters of the fat and muscle tissue classes:
Mean_Parotid = (0.41) * Mean_Fat + (0.59) * Mean_Muscle

SD_Parotid = (3.7) * SD_Muscle
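Expressed as code, with illustrative fat and muscle statistics assumed (in Hounsfield units):

```python
def parotid_gaussian(mean_fat, mean_muscle, sd_muscle):
    """Parotid class parameters from the relations above."""
    return 0.41 * mean_fat + 0.59 * mean_muscle, 3.7 * sd_muscle

# Illustrative fat/muscle statistics (assumed values).
print(parotid_gaussian(mean_fat=-90.0, mean_muscle=45.0, sd_muscle=10.0))
```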
[0071] In another embodiment, there are two prior probabilities, p_B(h) and p_I(h), one broad and one immediate, that combine into a single prior probability p(h). These probabilities are combined by using a blending factor b that specifies the degree to which to favor the broad prior over the immediate prior. This blending factor varies with the degree to which the slice contains
dental artifacts, which threaten the integrity of the data d. The artifacts are detected by fuzzy logic that examines image characteristics in the vicinity of the mandible.
[0072] The Bayesian classifier computes the a posteriori probability for each of the three tissue classes (fat, parotid, muscle) within the region of interest, and selects the class associated with the maximum a posteriori (MAP) probability, as expressed in the following equation:
h_MAP = arg max_h p(d|h) * (p_B(h) * b + p_I(h) * (1 - b))
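A minimal sketch of this MAP rule, with hypothetical class parameters and flat priors standing in for the broad and immediate maps:

```python
import numpy as np

def map_classify(d, params, broad, immediate, b):
    """Per-voxel MAP choice: argmax over h of p(d|h) * (b*pB + (1-b)*pI)."""
    classes = list(params)
    scores = []
    for h in classes:
        mean, sd = params[h]
        like = np.exp(-0.5 * ((d - mean) / sd) ** 2) / sd  # unnormalized
        scores.append(like * (b * broad[h] + (1.0 - b) * immediate[h]))
    return np.argmax(np.stack(scores), axis=0), classes

d = np.array([-80.0, 10.0, 50.0])          # three CT intensities
flat = {h: np.full(d.shape, 1 / 3) for h in ("fat", "parotid", "muscle")}
labels, names = map_classify(
    d,
    {"fat": (-90.0, 15.0), "parotid": (-10.0, 37.0), "muscle": (45.0, 10.0)},
    broad=flat, immediate=flat, b=0.5)
print([names[i] for i in labels])           # ['fat', 'parotid', 'muscle']
```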
[0073] Various features and advantages of the invention are set forth in the following claims.
Claims
1. A method of identifying an anatomical organ in a patient, the method comprising:
acquiring an image of the patient, the image including a plurality of image elements; segmenting the image to identify a region of interest defined by at least one anatomic landmark;
generating a probability map based on at least one probabilistic anatomical atlas; and applying the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
2. The method as set forth in claim 1, wherein segmenting the image includes segmenting the image into a plurality of tissue types.
3. The method as set forth in claim 2, wherein the tissue types include air, fat, muscle, bone, and skin.
4. The method as set forth in claim 3, wherein the image is further segmented to identify a plurality of anatomical structures.
5. The method as set forth in claim 4, wherein the probabilistic classification includes situated Bayesian classification.
6. The method as set forth in claim 5, wherein the situated Bayesian classification determines a likelihood probability and a prior probability.
7. The method as set forth in claim 6, wherein the likelihood probability is based on a brightness of the anatomical structures and the prior probability is based on a location of the anatomical structures.
8. The method as set forth in claim 4, further comprising applying a set of anatomic guidelines to establish patient-specific boundaries with respect to the identified anatomical structures.
9. The method as set forth in claim 1, wherein the at least one probabilistic anatomical atlas is constructed from training data.
10. The method as set forth in claim 9, wherein the training data includes segmented images from other patients.
11. The method as set forth in claim 1, further comprising deforming the probabilistic atlas.
12. The method as set forth in claim 11, wherein deforming the probabilistic atlas is performed by a Marionette method of image deformation.
13. The method as set forth in claim 12, further comprising using the Marionette method to manipulate a position of a number of control points of a shape model associated with the organ.
14. The method as set forth in claim 1, wherein each of the image elements comprises a voxel.
15. The method as set forth in claim 1, further comprising delineating the location of the organ within the region of interest.
16. A computer program embodied by a computer readable medium capable of being executed by a computer, the computer program for use in a radiation therapy treatment system, the computer program comprising:
an image generation module operable to acquire an image of a patient, the image including a plurality of image elements;
a segmentation module operable to segment the image to identify a region of interest defined by at least one anatomic landmark;
a probability map generation module operable to generate a probability map based on at least one probabilistic anatomical atlas; and
an organ identification module operable to apply the probability map to a probabilistic classification to determine a probability that an organ is located in the region of interest.
17. The computer program as set forth in claim 16, wherein the segmentation module is operable to segment the image into a plurality of anatomical structures.
18. The computer program as set forth in claim 16, wherein the probabilistic classification includes situated Bayesian classification.
19. The computer program as set forth in claim 18, wherein the situated Bayesian classification determines a likelihood probability and a prior probability.
20. The computer program as set forth in claim 19, wherein the likelihood probability is based on a brightness of the anatomical structures and the prior probability is based on a location of the anatomical structures.
21. The computer program as set forth in claim 16, wherein the organ identification module is operable to delineate the location of the organ within the region of interest.
22. A radiation therapy treatment system for identifying an anatomic organ in a patient, the radiation therapy treatment system comprising:
an image acquisition device operable to acquire an image of the patient, the image including a plurality of image elements;
a processor; and
a computer readable medium storing non-transitory programmed instructions that, when executed by the processor,
segment the image to identify a region of interest defined by at least one anatomic landmark,
generate a probability map based on at least one probabilistic anatomical atlas, and apply the probability map to a probabilistic classification to determine a probability that the organ is located in the region of interest.
23. The radiation therapy treatment system as set forth in claim 22, wherein the at least one anatomical atlas is constructed from training data.
24. The radiation therapy treatment system as set forth in claim 23, wherein the training data includes segmented images from other patients.
25. The radiation therapy treatment system as set forth in claim 22, wherein the non-transitory programmed instructions further include instructions that, when executed by the processor, deform the at least one probabilistic atlas.
26. The radiation therapy treatment system as set forth in claim 22, wherein the non-transitory programmed instructions further include instructions that, when executed by the processor, delineate the location of the organ within the region of interest.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US40016210P | 2010-07-23 | 2010-07-23 | |
US61/400,162 | 2010-07-23 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012012768A1 (en) | 2012-01-26 |
Family
ID=44629737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/045092 WO2012012768A1 (en) | System and method for identifying an anatomical organ in a patient | 2010-07-23 | 2011-07-22 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2012012768A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013114262A1 (en) * | 2012-02-01 | 2013-08-08 | Koninklijke Philips N.V. | Object image labeling apparatus, method and program |
WO2013122523A1 (en) * | 2012-02-17 | 2013-08-22 | Advanced Mr Analytics Ab | Method of classification of organs from a tomographic image |
WO2014027243A2 (en) * | 2012-08-15 | 2014-02-20 | Questor Capital Holdings Ltd. | Probability mapping system |
DE102014201321A1 (en) * | 2013-02-12 | 2014-08-14 | Siemens Aktiengesellschaft | Determination of lesions in image data of an examination object |
JP2015009152A (en) * | 2013-07-01 | 2015-01-19 | Toshiba Corporation | Medical image processor and medical image processing program |
WO2015015343A1 (en) * | 2013-07-31 | 2015-02-05 | Koninklijke Philips N.V. | Automation of therapy planning |
WO2015080647A1 (en) * | 2013-11-28 | 2015-06-04 | Raysearch Laboratories Ab | Method and system for uncertainty based radiotherapy treatment planning |
CN110072456A (en) * | 2016-12-15 | 2019-07-30 | Koninklijke Philips N.V. | X-ray apparatus with complex visual field |
US11024028B2 (en) | 2016-10-25 | 2021-06-01 | Koninklijke Philips N.V. | Device and method for quality assessment of medical image datasets |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007079207A2 (en) * | 2005-12-30 | 2007-07-12 | Yeda Research & Development Co. Ltd. | An integrated segmentation and classification approach applied to medical applications analysis |
US20090226060A1 (en) * | 2008-03-04 | 2009-09-10 | Gering David T | Method and system for improved image segmentation |
- 2011-07-22 WO PCT/US2011/045092 patent/WO2012012768A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007079207A2 (en) * | 2005-12-30 | 2007-07-12 | Yeda Research & Development Co. Ltd. | An integrated segmentation and classification approach applied to medical applications analysis |
US20090226060A1 (en) * | 2008-03-04 | 2009-09-10 | Gering David T | Method and system for improved image segmentation |
Non-Patent Citations (1)
Title |
---|
GERING D. ET AL: "Image Deformation Based on a Marionette Model", vol. 37, no. 6, 1 June 2010 (2010-06-01), pages 3127, XP002661877, Retrieved from the Internet <URL:http://scitation.aip.org/getpdf/servlet/GetPDFServlet?filetype=pdf&id=MPHYA6000037000006003127000003&idtype=cvips&doi=10.1118/1.3468146&prog=normal> [retrieved on 20111020], DOI: 10.1118/1.3468146 * |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104094315A (en) * | 2012-02-01 | 2014-10-08 | Koninklijke Philips N.V. | Object image labeling apparatus, method and program |
RU2635177C2 (en) * | 2012-02-01 | 2017-11-09 | Koninklijke Philips N.V. | Device, method and program of object image marking |
US9691156B2 (en) | 2012-02-01 | 2017-06-27 | Koninklijke Philips N.V. | Object image labeling apparatus, method and program |
WO2013114262A1 (en) * | 2012-02-01 | 2013-08-08 | Koninklijke Philips N.V. | Object image labeling apparatus, method and program |
WO2013122523A1 (en) * | 2012-02-17 | 2013-08-22 | Advanced Mr Analytics Ab | Method of classification of organs from a tomographic image |
US9818189B2 (en) | 2012-02-17 | 2017-11-14 | Advanced Mr Analytics Ab | Method of classification of organs from a tomographic image |
US9378462B2 (en) | 2012-08-15 | 2016-06-28 | Questor Capital Holdings Ltd. | Probability mapping system |
WO2014027243A2 (en) * | 2012-08-15 | 2014-02-20 | Questor Capital Holdings Ltd. | Probability mapping system |
WO2014027243A3 (en) * | 2012-08-15 | 2014-04-17 | Questor Capital Holdings Ltd. | Probability mapping system |
DE102014201321A1 (en) * | 2013-02-12 | 2014-08-14 | Siemens Aktiengesellschaft | Determination of lesions in image data of an examination object |
JP2015009152A (en) * | 2013-07-01 | 2015-01-19 | Toshiba Corporation | Medical image processor and medical image processing program |
JP2016525427A (en) * | 2013-07-31 | 2016-08-25 | Koninklijke Philips N.V. | Treatment plan automation |
US20160166855A1 (en) * | 2013-07-31 | 2016-06-16 | Koninklijke Philips N.V. | Automation of therapy planning |
CN105473182A (en) * | 2013-07-31 | 2016-04-06 | Koninklijke Philips N.V. | Automation of therapy planning |
WO2015015343A1 (en) * | 2013-07-31 | 2015-02-05 | Koninklijke Philips N.V. | Automation of therapy planning |
US10022560B2 (en) | 2013-07-31 | 2018-07-17 | Koninklijke Philips N.V. | Automation of therapy planning |
CN105473182B (en) * | 2013-07-31 | 2018-11-13 | Koninklijke Philips N.V. | Automation of therapy planning |
CN106163612A (en) * | 2013-11-28 | 2016-11-23 | RaySearch Laboratories AB | Method and system for uncertainty based radiotherapy treatment planning |
JP2017514532A (en) * | 2013-11-28 | 2017-06-08 | RaySearch Laboratories AB | Method and system for radiation therapy treatment planning based on uncertainty |
WO2015080647A1 (en) * | 2013-11-28 | 2015-06-04 | Raysearch Laboratories Ab | Method and system for uncertainty based radiotherapy treatment planning |
EP2878338B1 (en) * | 2013-11-28 | 2018-04-11 | RaySearch Laboratories AB | Method and system for uncertainty based radiotherapy treatment planning |
US10300300B2 (en) | 2013-11-28 | 2019-05-28 | Raysearch Laboratories Ab | Method and system for uncertainty based radiotherapy treatment planning |
US11024028B2 (en) | 2016-10-25 | 2021-06-01 | Koninklijke Philips N.V. | Device and method for quality assessment of medical image datasets |
CN110072456A (en) * | 2016-12-15 | 2019-07-30 | Koninklijke Philips N.V. | X-ray apparatus with complex visual field |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Li et al. | A preliminary study of using a deep convolution neural network to generate synthesized CT images based on CBCT for adaptive radiotherapy of nasopharyngeal carcinoma | |
WO2012012768A1 (en) | System and method for identifying an anatomical organ in a patient | |
CN112770811A (en) | Method and system for radiation therapy treatment planning using a deep learning engine | |
US8803910B2 (en) | System and method of contouring a target area | |
CN114206438B (en) | Predicting radiation therapy control points using projection images | |
EP2854946B1 (en) | Elasticity imaging-based planning systems for improved gating efficiency and dynamic margin adjustment in radiation therapy | |
US20110019889A1 (en) | System and method of applying anatomically-constrained deformation | |
US11682485B2 (en) | Methods and systems for adaptive radiotherapy treatment planning using deep learning engines | |
US20200105394A1 (en) | Methods and systems for adaptive radiotherapy treatment planning using deep learning engines | |
EP4126214B1 (en) | Automatically-planned radiation-based treatment | |
CN110960803B (en) | Computer system for performing adaptive radiation therapy planning | |
US12033322B2 (en) | Systems and methods for image cropping and anatomical structure segmentation in medical imaging | |
CN113891742B (en) | Method and system for continuous deep learning based radiotherapy treatment planning | |
US11478210B2 (en) | Automatically-registered patient fixation device images | |
US11406844B2 (en) | Method and apparatus to derive and utilize virtual volumetric structures for predicting potential collisions when administering therapeutic radiation | |
Su et al. | Marker-less intra-fraction organ motion tracking using hybrid ASM |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11741039 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 11741039 Country of ref document: EP Kind code of ref document: A1 |