CN117815573A - Image processing method, device, system and storage medium - Google Patents

Image processing method, device, system and storage medium

Info

Publication number
CN117815573A
CN117815573A
Authority
CN
China
Prior art keywords
tumor
plane
image
medical
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310532321.5A
Other languages
Chinese (zh)
Inventor
陈佳赟
王小奇
戴建荣
廖雪吟
左嘉旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Feiyan Sihai Medical Technology Co ltd
Original Assignee
Beijing Feiyan Sihai Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Feiyan Sihai Medical Technology Co ltd

Abstract

The application discloses an image processing method, an image processing apparatus, an image processing system and a storage medium. The method comprises the following steps: acquiring a first medical scan image of a target object, and determining a tumor planning target volume of the target object based on the first medical scan image; acquiring a second medical scan image of the target object, and determining a real-time motion trajectory of a tumor-related region of the target object based on the second medical scan image; and performing, based on the tumor planning target volume and/or the real-time motion trajectory, at least one of the following operations: target volume confirmation, beam opening and closing, and dose estimation. The first medical scan image and the second medical scan image comprise magnetic resonance images acquired by multi-planar dynamic imaging based on field directions, of which there are a plurality; for any field direction, the imaging planes include at least a first plane along the incident path of the field and a second plane perpendicular to the incident path; the first plane and the second plane are perpendicular to each other.

Description

Image processing method, device, system and storage medium
Technical Field
The present invention relates to the field of medical data processing, and in particular, to a method, apparatus, system and storage medium for processing medical scan images related to radiotherapy.
Background
The greatest challenge in tumor radiation therapy (hereinafter, tumor radiotherapy or radiotherapy) is to obtain the highest tumor cure rate at the cost of minimal normal-tissue damage. Theoretically, the simplest way to improve the efficacy of radiotherapy is to enclose and kill all cancer cells with a sufficient dose of radiation during treatment while sparing the surrounding normal tissue. The key is therefore to accurately identify the position of the cancer cells, take it as the target volume, and effectively modulate the radiation so as to maximize the therapeutic gain ratio. Currently, improved local and systemic treatments extend the survival of patients with oligometastatic disease. Radiotherapy faces two major challenges. First, treating a target volume adjacent to several important organs at risk; for example, the target volume in liver metastasis is surrounded by radiation-sensitive organs such as the duodenum, stomach, kidney and colon. Second, the target volume and the surrounding organs at risk move relative to each other. By precisely locking the positions of the target volume and the surrounding organs at risk, the effect of radiotherapy can be improved, the dose absorbed by organs around the tumor can be reduced, tumor control can be increased, and normal-tissue toxicity can be lowered. Accurate determination of the tumor planning target volume is thus one of the problems to be solved in precise radiotherapy.
Disclosure of Invention
The application discloses an image processing method, an image processing apparatus, an image processing system and a storage medium. The image processing method is based on magnetic resonance images acquired by multi-planar dynamic imaging in the field directions, and can accurately delineate the tumor planning target volume.
In one aspect, the application discloses an image processing method. The image processing method comprises: acquiring a first medical scan image of a target object, and determining a tumor planning target volume of the target object based on the first medical scan image; acquiring a second medical scan image of the target object, and determining a real-time motion trajectory of a tumor-related region of the target object based on the second medical scan image; and performing, based on the tumor planning target volume and/or the real-time motion trajectory, at least one of the following operations: target volume confirmation, beam opening and closing, and dose estimation. The first medical scan image and the second medical scan image comprise magnetic resonance images acquired by multi-planar dynamic imaging based on field directions, of which there are a plurality; for any field direction, the imaging planes include at least a first plane along the incident path of the field and a second plane perpendicular to the incident path; the first plane and the second plane are perpendicular to each other.
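For illustration only, the claimed sequence of steps can be sketched as follows; the function names, signatures and data types are the editor's assumptions, not the disclosed implementation:

```python
from typing import Callable, Sequence

def run_image_processing(first_image, second_image,
                         determine_ptv: Callable,
                         track_motion: Callable,
                         operations: Sequence[Callable]) -> list:
    """Outline of the claimed method: derive the tumor planning target
    volume (PTV) from the first MR image, derive the real-time motion
    trajectory from the second MR image, then apply each requested
    operation (target volume confirmation, beam opening/closing,
    dose estimation)."""
    ptv = determine_ptv(first_image)          # from the first MR image
    trajectory = track_motion(second_image)   # from the second MR image
    # Perform at least one operation based on the PTV and/or trajectory.
    return [op(ptv, trajectory) for op in operations]
```

Each `op` would be one of the three operations described in the embodiments below.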
According to some embodiments of the present application, the field directions used in acquiring the first medical scan image and the second medical scan image are the same in number and in orientation.
According to some embodiments of the present application, the determining a tumor planning target volume comprises: determining a tumor region of the target object using machine learning, preferably using deep learning, more preferably using a deep neural network model, to process the first medical scan image; and expanding the boundary of the tumor region outward to obtain the tumor planning target volume, wherein the tumor boundary is expanded outward by no more than 8 millimeters.
According to some embodiments of the application, the tumor region boundary is expanded outward by a distance of 3 to 8 millimeters.
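A minimal sketch of the boundary expansion described above, using a Euclidean distance transform so that the outward margin is measured in millimeters regardless of voxel spacing; the function name and the 5 mm default are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def expand_tumor_margin(tumor_mask, voxel_spacing_mm, margin_mm=5.0):
    """Expand a binary tumor mask outward by a uniform margin (in mm)
    to form the planning target volume. The 5 mm default is an
    illustrative assumption; the method above caps the margin at 8 mm."""
    assert 0.0 < margin_mm <= 8.0, "margin is limited to 8 mm"
    mask = np.asarray(tumor_mask, dtype=bool)
    # Distance (in mm) from every voxel to the nearest tumor voxel;
    # `sampling` makes the transform respect anisotropic voxel spacing.
    dist_mm = ndimage.distance_transform_edt(~mask, sampling=voxel_spacing_mm)
    # The PTV is the tumor plus every voxel within margin_mm of it.
    return dist_mm <= margin_mm
```

The distance-transform formulation avoids per-axis dilation loops and handles anisotropic MR voxels directly.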
According to some embodiments of the present application, for any of the field directions, the imaging planes include a first plane along the incident path of the field, a second plane perpendicular to the incident path, and a third plane along the incident path of the field; the first plane, the second plane and the third plane are mutually perpendicular.
According to some embodiments of the present application, an angle between the first plane or the third plane and the incident path is not more than 45 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 30 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 10 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 1 °.
According to some embodiments of the present application, a second included angle between the perpendicular to the second plane and the incident path is not more than 45°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 30°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 10°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 1°.
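The angular tolerances above reduce to elementary vector geometry. The following sketch (function name is an assumption) computes both the angle between a plane and the incident path, relevant to the first and third planes, and the angle between the plane's perpendicular and the path, relevant to the second plane:

```python
import numpy as np

def plane_path_angles(plane_normal, incident_dir):
    """Return (plane_angle, normal_angle) in degrees:
    plane_angle  - angle between the plane and the incident path
                   (small for a plane along the incident path);
    normal_angle - angle between the plane's perpendicular and the path
                   (small for a plane perpendicular to the path)."""
    n = np.asarray(plane_normal, dtype=float)
    d = np.asarray(incident_dir, dtype=float)
    n /= np.linalg.norm(n)
    d /= np.linalg.norm(d)
    cos_nd = abs(float(np.dot(n, d)))
    normal_angle = float(np.degrees(np.arccos(np.clip(cos_nd, 0.0, 1.0))))
    # The angle between a line and a plane is the complement of the
    # angle between the line and the plane's normal.
    plane_angle = 90.0 - normal_angle
    return plane_angle, normal_angle
```

A plane containing the incident path gives plane_angle = 0°, satisfying every first/third-plane tolerance above; a plane whose perpendicular is parallel to the path gives normal_angle = 0°, satisfying every second-plane tolerance.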
According to some embodiments of the application, the target volume confirmation comprises: determining whether the real-time motion trajectory of the tumor-related region exceeds the tumor planning target volume; if so, confirming the tumor planning target volume as the target volume; if not, updating the tumor planning target volume based on the second medical scan image.
According to some embodiments of the present application, the updating the tumor planning target volume based on the second medical scan image comprises: processing the second medical scan image with a deep neural network model to determine the tumor region of the target object and update the tumor boundary; and updating the tumor planning target volume based on the updated tumor region boundary.
According to some embodiments of the present application, the beam opening and closing comprises: determining whether the real-time motion trajectory of the tumor-related region exceeds the tumor planning target volume; if the real-time motion trajectory lies within the tumor planning target volume, generating a first control signal and transmitting it to a radiation generation device so that the radiation generation device emits a radiation beam based on the first control signal; and if the real-time motion trajectory lies outside the tumor planning target volume, generating a second control signal and transmitting it to the radiation generation device so that the radiation generation device stops emitting the radiation beam based on the second control signal.
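The gating logic above amounts to a point-in-volume test per imaging frame. A sketch follows; the signal codes and the voxel-index representation of the trajectory are illustrative assumptions:

```python
import numpy as np

BEAM_ON, BEAM_OFF = 1, 0  # hypothetical codes for the two control signals

def beam_control_signal(trajectory_points, ptv_mask):
    """Return BEAM_ON (the first control signal) while every tracked
    point of the tumor-related region lies inside the planning target
    volume mask, and BEAM_OFF (the second control signal) otherwise.
    Trajectory points are given as voxel indices into the PTV mask."""
    points = np.atleast_2d(trajectory_points)
    inside = all(bool(ptv_mask[tuple(p)]) for p in points)
    return BEAM_ON if inside else BEAM_OFF
```

In a real system this decision would run once per dynamic-imaging frame, with the resulting signal sent to the radiation generation device.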
According to some embodiments of the present application, the dose estimation comprises: acquiring an auxiliary medical scan image related to the tumor of the target object; registering and fusing the second medical scan image and the auxiliary medical scan image to obtain a verification medical image; and determining, based on the verification medical image, the radiation dose received by the tumor planning target volume.
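Once the second image and the auxiliary image have been registered and fused onto a common geometry, the dose received by the planning target volume can be summarized as in the following sketch. The statistic names and the dose-volume metric D95 are conventional radiotherapy quantities; the function itself is an editor's illustration, not the disclosed implementation:

```python
import numpy as np

def ptv_dose_statistics(dose_grid_gy, ptv_mask):
    """Summarize the dose absorbed by the planning target volume, given a
    dose grid (in Gy) already registered onto the same voxel geometry as
    the verification medical image and a binary PTV mask."""
    doses = np.asarray(dose_grid_gy, dtype=float)[np.asarray(ptv_mask, dtype=bool)]
    return {
        "mean_gy": float(doses.mean()),
        "min_gy": float(doses.min()),
        "max_gy": float(doses.max()),
        # D95: the dose received by at least 95% of the PTV volume.
        "d95_gy": float(np.percentile(doses, 5)),
    }
```

The registration/fusion step itself (e.g., of the second MR image with a 4D-CT or 3D-CT image) is assumed to have been performed upstream.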
According to some embodiments of the present application, the auxiliary medical image comprises at least a 4D-CT image or a 3D-CT image.
Another aspect of the present application provides an image processing apparatus. The apparatus comprises: a tumor planning target volume determination module configured to acquire a first medical scan image of a target object and determine a tumor planning target volume of the target object based on the first medical scan image; a trajectory determination module configured to acquire a second medical scan image of the target object and determine a real-time motion trajectory of a tumor-related region of the target object based on the second medical scan image; and an execution module configured to perform, based on the tumor planning target volume and/or the real-time motion trajectory, at least one of the following operations: target volume confirmation, beam opening and closing, and dose estimation. The first medical scan image and the second medical scan image comprise magnetic resonance images acquired by multi-planar dynamic imaging based on field directions, of which there are a plurality; for any field direction, the imaging planes include at least a first plane along the incident path of the field and a second plane perpendicular to the incident path; the first plane and the second plane are perpendicular to each other.
According to some embodiments of the application, the field directions used in acquiring the first medical scan image and the second medical scan image are the same in number and in orientation.
According to some embodiments of the application, to determine the tumor planning target volume, the target volume determination module is configured to: determine a tumor region of the target object using machine learning, preferably using deep learning, more preferably using a deep neural network model, to process the first medical scan image; and expand the boundary of the tumor region outward to obtain the tumor planning target volume, wherein the tumor region boundary is expanded outward by no more than 8 millimeters.
According to some embodiments of the application, the tumor region boundary is expanded outward by a distance of 3 to 8 millimeters.
According to some embodiments of the present application, for any of the field directions, the imaging planes include a first plane along the incident path of the field, a second plane perpendicular to the incident path, and a third plane along the incident path of the field; the first plane, the second plane and the third plane are mutually perpendicular.
According to some embodiments of the present application, an angle between the first plane or the third plane and the incident path is not more than 45 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 30 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 10 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 1 °.
According to some embodiments of the present application, a second included angle between the perpendicular to the second plane and the incident path is not more than 45°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 30°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 10°; alternatively, the second included angle between the perpendicular to the second plane and the incident path is not more than 1°.
According to some embodiments of the application, to perform the target volume confirmation operation, the execution module is configured to: determine whether the real-time motion trajectory of the tumor-related region exceeds the tumor planning target volume; if so, confirm the tumor planning target volume as the target volume; if not, update the tumor planning target volume based on the second medical scan image.
According to some embodiments of the application, to update the tumor planning target volume based on the second medical scan image, the execution module is configured to: process the second medical scan image with a deep neural network model to determine an updated tumor region of the target object and an updated tumor boundary; and update the tumor planning target volume based on the updated tumor region and its boundary.
According to some embodiments of the application, to perform the beam opening and closing operation, the execution module is configured to: determine whether the real-time motion trajectory of the tumor-related region exceeds the tumor planning target volume; if the real-time motion trajectory lies within the tumor planning target volume, generate a first control signal and transmit it to a radiation generation device so that the radiation generation device emits a radiation beam based on the first control signal; and if the real-time motion trajectory lies outside the tumor planning target volume, generate a second control signal and transmit it to the radiation generation device so that the radiation generation device stops emitting the radiation beam based on the second control signal.
According to some embodiments of the present application, to perform the dose estimation operation, the execution module is configured to: acquire an auxiliary medical scan image related to the tumor of the target object; register and fuse the second medical scan image and the auxiliary medical scan image to obtain a verification medical image; and determine, based on the verification medical image, the radiation dose received by the tumor planning target volume.
According to some embodiments of the present application, the auxiliary medical image includes, but is not limited to, a 4D-CT image or a 3D-CT image.
Another aspect of the present application provides a system. The system comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method described above.
Another aspect of the present application provides a computer-readable storage medium. The storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method as described above.
Another aspect of the present application provides a medical system. The medical system comprises an image processing device as described above.
By implementing the method and the device, accurate target region delineation can be performed based on the magnetic resonance image acquired by multi-plane dynamic imaging of the field direction.
Drawings
The present application will be further illustrated by way of example embodiments, which will be described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the drawings, like numerals represent like structures.
FIG. 1 is a schematic diagram of an exemplary medical imaging system shown in accordance with some embodiments of the present application;
FIG. 2 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present application;
FIG. 3 is an exemplary flow chart of an image processing method shown in accordance with some embodiments of the present application;
FIG. 4 is an exemplary flow chart of tumor plan target volume update shown according to some embodiments of the present application;
FIG. 5 is an exemplary flow chart of beam switching shown according to some embodiments of the present application;
FIG. 6 is an exemplary flow chart of dose estimation shown according to some embodiments of the present application;
fig. 7 is an exemplary block diagram of a processing device, according to some embodiments of the present application.
Detailed Description
In order to make the above objects, features and advantages of the present application more comprehensible, embodiments are described in detail below with reference to the accompanying figures. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application may, however, be embodied in many forms other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the application. The terms "first," "second," and the like, as used herein, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises" means that the elements or items listed after the word, and their equivalents, are included, but other elements or items are not excluded. "Upper," "lower," "left," "right," and the like indicate relative positional relationships only, which may change when the absolute position of the described object changes. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Existing determination of the tumor planning target volume may use CT images, MRI images, or both. However, conventional coronal, sagittal and transverse images lack information about the movement, position and displacement of the tumor and the surrounding organs along the current beam path, so it cannot be accurately determined whether the tumor lies within the beam. Although a deformation algorithm can solve part of the problem through registration and deformation fields, a clinician is still required to collect MRI images of the patient before radiotherapy, register them with the planning CT/MRI images acquired before the first treatment, manually delineate on the new images for adaptive radiotherapy, and check the registration results. This not only imposes a heavy workload on clinicians; the manual intervention also makes real-time tracking more prone to failure, leading to poor radiotherapy outcomes.
When determining the tumor planning target volume, the image processing method disclosed in the present application scans imaging planes based on the field directions, so it can effectively determine whether the tumor planning target volume meets the radiotherapy requirements in the beam direction, which helps increase the accuracy of radiotherapy.
The term "tumor planning target volume" in the present application refers to the planned treatment target volume in tumor radiotherapy, and may include the tumor region of the target object, the outward-expanded boundary of the tumor region, and the region enclosed after the tumor boundary is expanded.
Some preferred embodiments of the present application are described below with reference to the accompanying drawings. It should be noted that the following description is for illustrative purposes and is not intended to limit the scope of the present application. The flowcharts used are to illustrate the operations performed by systems according to embodiments of the present application. It should be understood that the operations described are not necessarily performed in order precisely. Rather, the various steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a schematic diagram of an exemplary medical imaging system shown in accordance with some embodiments of the present application. As shown in fig. 1, the medical imaging system 100 may include an imaging device 110, a processing device 120, one or more terminals 130, a network 140, and a storage device 150. The various components in the medical imaging system 100 may be connected in a variety of ways. For example, the imaging device 110, the terminal 130, and/or the storage device 150 may be coupled to the processing device 120 via the network 140, or may be coupled directly to the processing device 120.
The imaging device 110 may be an imaging device such as a CT, MRI or PET device, or a combination thereof, and generates a medical scan image based on scan data obtained by scanning a target object (e.g., a patient). The medical scan image may be two-dimensional or three-dimensional. In some embodiments, the imaging device 110 may also be a medical device that incorporates radiation therapy functionality. For example, the imaging device 110 may be a device in which an MRI imaging apparatus and an accelerator (including a linear accelerator or a cyclotron) are integrated. Medical scan images obtained with the MRI imaging apparatus may be used for certain steps of radiation therapy, for example, determining the tumor planning target volume prior to radiotherapy. The accelerator may then generate and emit therapeutic radiation (e.g., an X-ray beam) onto the target object to kill cancer cells and thereby treat the malignancy. The accelerator may also rotate, for example, with the gantry of the imaging device 110; it may rotate clockwise or counterclockwise about the gantry axis to achieve 360-degree radiation exposure or a specific exposure angle (e.g., a predetermined field direction).
Processing device 120 may process data and/or information obtained from imaging device 110, terminal 130, and/or storage device 150. For example, the processing device 120 may acquire a medical scan image of the target object, e.g., from the imaging device 110. In some embodiments, the processing device 120 may acquire a first medical scan image of a target object and determine a tumor planning target volume of the target object based on the first medical scan image. In some embodiments, the processing device 120 may acquire a second medical scan image of the target object and determine a real-time motion trajectory of a tumor of the target object based on the second medical scan image. In some embodiments, the processing device 120 may perform one or more of target volume confirmation, beam opening and closing, or dose estimation operations based on the tumor planning target volume and/or the real-time motion trajectory. In some embodiments, the processing device 120 may be a single server or a server group. The server group may be centralized or distributed (e.g., the processing device 120 may be a distributed system). In some embodiments, the processing device 120 may be local or remote. In some embodiments, processing device 120 may obtain information and/or data from imaging device 110, terminal 130, and/or storage device 150 via network 140. Processing device 120 may also be directly connected to imaging device 110 and/or storage device 150 to access information and/or data. In some embodiments, processing device 120 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination of the foregoing examples. In some embodiments, processing device 120 may be implemented on a computing device as shown in fig. 2 of the present application. For example, processing device 120 may be implemented on a computing device 200 as shown in fig. 2, including one or more components of computing device 200.
The terminal 130 may include one or more devices with data communication capabilities, such as a smart mobile device 130-1, a tablet computer 130-2, a notebook computer 130-3, a desktop computer 130-4, and the like. In some embodiments, the smart mobile device 130-1 may include, but is not limited to, a smart phone, smart glasses, a smart watch, a wearable device, a virtual display device, a display enhancement device, and the like, or any combination thereof. In some embodiments, terminal 130 may be part of processing device 120. For example, the computing resources (e.g., CPU, GPU, etc.) of the terminal 130 itself may be used to implement some of the functionality of the processing device 120. In some embodiments, terminal 130 may be a console of imaging device 110. Through the input devices of the terminal 130 (e.g., the touch keyboard of the smart mobile device 130-1 or the tablet computer 130-2, the mouse and keyboard of the notebook computer 130-3 or the desktop computer 130-4, etc.), a user (e.g., a doctor) may issue control instructions to the imaging device 110, such as scan imaging, beam opening and closing, and position adjustment (e.g., controlling movement of a treatment couch carrying the target object).
The network 140 may facilitate the exchange of information and/or data. In some embodiments, one or more components in the medical imaging system 100 (e.g., the imaging device 110, the processing device 120, the terminal 130, the storage device 150, etc.) may send information and/or data to other components in the medical imaging system 100 over the network 140. For example, the processing device 120 may obtain data (e.g., medical scan images) from the imaging device 110 over the network 140. In some embodiments, network 140 may include one or more of a public network (e.g., the Internet), a private network (e.g., a local area network (LAN) or wide area network (WAN)), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, server computers, or any combination thereof.
For example, network 140 may include one or more combinations of a wired network, a fiber optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, a Global System for Mobile communications (GSM) network, a Code Division Multiple Access (CDMA) network, a Time Division Multiple Access (TDMA) network, a General Packet Radio Service (GPRS) network, an Enhanced Data rates for GSM Evolution (EDGE) network, a Wideband Code Division Multiple Access (WCDMA) network, a High Speed Downlink Packet Access (HSDPA) network, a Long Term Evolution (LTE) network, a User Datagram Protocol (UDP) network, a Transmission Control Protocol/Internet Protocol (TCP/IP) network, a Short Message Service (SMS) network, a Wireless Application Protocol (WAP) network, an ultra-wideband (UWB) network, a mobile communications (1G, 2G, 3G, 4G, 5G) network, Wi-Fi, Li-Fi, narrowband Internet of Things (NB-IoT), infrared communications, and the like. In some embodiments, network 140 may include one or more network access points. For example, network 140 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the medical imaging system 100 may connect to network 140 to exchange data and/or information.
The storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data obtained from the imaging device 110, the processing device 120, and the terminal 130. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 120 may execute or use to implement the example methods described herein. In some embodiments, storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination of the above. Exemplary mass storage devices may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary RAM may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), and the like. Exemplary read-only memory may include mask read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM), digital versatile disc read-only memory, and the like. In some embodiments, storage device 150 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination of the foregoing examples.
For example, some algorithms or data in the present application may be stored on a cloud platform, and updated periodically, and the processing device 120 accesses the algorithms or data through a network, so as to implement unification and interaction of the algorithms or data of the whole platform.
In some embodiments, the storage device 150 may be connected to the network 140 to enable communication with one or more components in the medical imaging system 100 (e.g., the imaging device 110, the processing device 120, the terminal 130, etc.). One or more components of medical imaging system 100 may access data or instructions stored in storage device 150 through network 140. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more components of the medical imaging system 100 (e.g., the imaging device 110, the processing device 120, the terminal 130, etc.). In some embodiments, the storage device 150 may be part of the processing device 120.
Fig. 2 is a block diagram of an exemplary processing device, shown in accordance with some embodiments of the present application. Processing device 120 may include any of the components used to implement the systems described in embodiments of the present application. For example, the processing device 120 may be implemented in hardware, software programs, firmware, or a combination thereof. For convenience, only one processing device is depicted, but the computing functions described in connection with the medical imaging system 100 of the embodiments herein may be implemented in a distributed manner by a set of similar platforms to distribute the processing load of the system.
In some embodiments, processing device 120 may include a processor 210, a memory 220, an input/output component 230, and a communication port 240. In some embodiments, the processor (e.g., CPU) 210 may execute program instructions in the form of one or more processors. In some embodiments, the memory 220 includes various forms of program memory and data memory, such as a hard disk, read-only memory (ROM), random access memory (RAM), etc., for storing a wide variety of data files for processing and/or transmission by a computer. In some embodiments, the input/output component 230 may be used to support input/output between the processing device 120 and other components. In some embodiments, the communication port 240 may be connected to a network for enabling data communication. An exemplary processing device may include program instructions stored in read-only memory (ROM), random access memory (RAM), and/or other types of non-transitory storage media for execution by processor 210. The methods and/or processes of embodiments of the present application may be implemented in the form of program instructions. The processing device 120 may also receive the programs and data disclosed in the present application via network communication.
For ease of understanding, only one processor is schematically depicted in fig. 2. However, it should be noted that the processing device 120 in the embodiments of the present application may include multiple processors, and thus the operations and/or methods described in the embodiments of the present application as being implemented by one processor may also be implemented by multiple processors jointly or independently. For example, if in the present application the processors of processing device 120 perform operations a and B, it should be understood that operations a and B may also be performed jointly or independently by two different processors of processing device 120 (e.g., a first processor performing operation a, a second processor performing operation B, or both first and second processors jointly performing operations a and B).
Fig. 3 is an exemplary flow chart of an image processing method shown in accordance with some embodiments of the present application. In some embodiments, the process 300 may be performed by the processing system 700. For example, the process 300 may be stored in a storage device (e.g., a self-contained memory unit or an external memory device of the processing system 700) in the form of a program or instructions that, when executed, may implement the process 300. As shown in fig. 3, the process 300 may include the following operations.
Step 310, a first medical scan image of a target object is acquired and a tumor planning target volume of the target object is determined based on the first medical scan image. This step may be performed by the target determination module 710.
In some embodiments, the target object may be a patient, an animal, or the like, or a portion thereof, for example, certain organs, tissues, etc. of the patient. Illustratively, the target object may be a suspected tumor region. For example, if the patient suffers from lung cancer, liver cancer, stomach cancer, breast cancer, etc., the target object may be the whole thoracic cavity, abdominal cavity, etc. of the patient. The first medical scan image may include an image of the target object scanned using various types of related imaging devices (e.g., imaging device 110 of fig. 1). The imaging device may include CT, MRI, PET, PET-CT, C-arm (e.g., DSA), etc. For example, CT images, MRI images, PET-CT images, DSA images, etc., obtained after operations such as scanning, data acquisition, data processing, and image reconstruction may be referred to as the first medical scan image.
In some embodiments, the first medical scan image may comprise a magnetic resonance image acquired based on multi-planar dynamic imaging of the field direction. By way of example, the imaging device (e.g., imaging device 110) may be a medical device that combines MRI imaging and radiotherapy capabilities. The field direction may be the irradiation direction of the radiation beam used for radiotherapy, and may be represented by the rotation angle of the gantry. For example, the field direction may be 0°, 90°, 180°, 270°, 300°, 330°, and the like. The field directions may be preset, and there may be a plurality of them. For example, different numbers of field directions may be set according to the type of tumor, such as 3, 5, 7, 9, etc., or others.
In some embodiments, based on the field direction, an imaging plane of a respective magnetic resonance scan may be set. For any one of the field directions, the imaging plane may include at least a first plane along an incident path of the field, and a second plane perpendicular to the incident path, wherein the first plane and the second plane are perpendicular to each other. Taking the xyz three-dimensional axes as an example, assuming that the incident path corresponding to a certain field direction is along the x-axis, the first plane may be a plane containing or parallel to the x-axis. For simplicity of illustration, the first plane may be the xoy plane. Then, the second plane, which is perpendicular to the incident path and perpendicular to the first plane, is the yoz plane. In some embodiments, for any one field direction, the imaging plane may include a first plane along the incident path of the field, a second plane perpendicular to the incident path, and a third plane along the incident path of the field. The first plane, the second plane, and the third plane are perpendicular to each other. With continued reference to the example above, the first plane may be the xoy plane, the second plane the yoz plane, and the third plane the xoz plane. In some embodiments, each imaging plane may pass through the isocenter. The isocenter may be located within the tumor. Each imaging plane may cover the isocenter.
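For illustration only, the orthogonality of the imaging planes for a given field direction can be sketched as follows. The gantry-angle convention (beam axis rotating in the x-z plane, pointing along +z at 0°) and the function name are assumptions for this sketch, not part of the claimed method.

```python
import math

def imaging_plane_normals(gantry_deg):
    """Illustrative sketch: for a gantry angle in degrees, return the beam
    direction together with unit normals of the first plane (a plane that
    contains the incident path) and the second plane (perpendicular to the
    incident path). Convention assumed: the beam axis rotates in the x-z
    plane and points along +z at 0 degrees."""
    t = math.radians(gantry_deg)
    beam = (math.sin(t), 0.0, math.cos(t))  # incident path direction
    n_second = beam                          # second plane: normal = beam axis
    # The first plane contains the beam axis and the y-axis; its normal is
    # the cross product beam x (0, 1, 0) = (-beam_z, 0, beam_x).
    n_first = (-beam[2], 0.0, beam[0])
    return beam, n_first, n_second
```

For a 90° field the beam runs along the x-axis, the second plane comes out as the yoz plane (normal along x), and the first plane as the xoy plane (normal along z), matching the coordinate example in the text.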
In some embodiments, the imaging plane (e.g., the first plane or the third plane) along the incident path of the field may not be perfectly aligned with the path; a reasonable deviation between them is allowed. For example, an angle may exist between the first plane or the third plane and the incident path, and the angle may not exceed 45°. Alternatively, the included angle may be no more than 30°, 20°, 10°, 9°, 8°, 7°, 6°, 5°, 4°, 3°, 2°, or 1°. The second plane, which is nominally perpendicular to the incident path and perpendicular to the remaining imaging planes, may also be offset from its nominal orientation. For example, an angle may exist between the normal of the second plane and the incident path, and the angle may be no more than 45°, 30°, 20°, or 10°. Alternatively, the included angle may be no more than 9°, 8°, 7°, 6°, 5°, 4°, 3°, 2°, or 1°.
In some embodiments, the first medical scan image may contain a plurality of slice image datasets, for example, an image dataset consisting of slice images corresponding to three mutually orthogonal imaging planes. One image dataset may correspond to one field direction. These slice images can be jointly reconstructed to generate a three-dimensional stereoscopic image.
In some embodiments, the imaging parameters used to acquire the first medical scan image may be predetermined, including imaging center coordinates, imaging range (FOV), imaging plane, imaging scan time, and the like. These imaging parameters may be pre-stored in a self-contained memory unit or external memory device of the processing system 700 and invoked directly when medical imaging is performed. Likewise, after imaging is completed, the resulting first medical scan image may be stored (e.g., in a memory unit or memory device as described above) and later retrieved by the target determination module 710, or transmitted directly to the target determination module 710.
In some embodiments, the tumor planning target volume may refer to an area that receives beam irradiation. The region may include a tumor subject and potentially invaded regions distributed about the tumor subject. The target determination module 710 may use a target recognition/image segmentation algorithm to process the first medical scan image and determine a tumor boundary of a tumor region of the target object. The target recognition/image segmentation algorithm may be implemented using machine learning, preferably deep learning, more preferably a deep neural network model. Exemplary target recognition algorithms can include R-CNN, SPP-Net, Fast R-CNN, R-FCN, OverFeat, YOLO, SSD, DSSD, and the like. Exemplary image segmentation algorithms may include threshold-based algorithms, cluster-based algorithms, region-based algorithms, graph-cut-based algorithms, active contours and level sets, neural-network-based algorithms, and the like. Exemplary threshold-based algorithms include, but are not limited to, direct thresholding, iterative thresholding, triangular thresholding, Otsu's maximum between-class variance method, the Sauvola local binarization algorithm, and the like. Exemplary cluster-based algorithms include, but are not limited to, the Mean Shift algorithm, the K-Means algorithm, the Fuzzy C-Means algorithm, the Medoid Shift algorithm, the Turbopixels algorithm, and the like. Exemplary region-based algorithms include, but are not limited to, thresholding, region growing, region merging, region splitting and merging, and the like. Exemplary graph-cut-based algorithms include, but are not limited to, the Goldberg-Tarjan algorithm, the Ford-Fulkerson algorithm, and the like. Exemplary neural-network-based algorithms can include, but are not limited to, CNNs, FCNs, U-Net, V-Net, CRNs, RNNs, and the like. Likewise, the target recognition algorithm and/or the image segmentation algorithm may be pre-trained and stored, for example, in a self-contained storage unit or an external storage device of the processing system 700. The target determination module 710 may directly invoke the target recognition algorithm and/or the image segmentation algorithm when determining the tumor planned target volume.
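As one concrete instance of the threshold-based algorithms listed above, Otsu's maximum between-class variance method can be sketched in a few lines. This is a minimal illustration on a flat list of 8-bit pixel values, not the implementation used by the system.

```python
def otsu_threshold(pixels, levels=256):
    """Return the threshold t that maximizes the between-class variance
    w0 * w1 * (m0 - m1)^2, splitting pixels into {<= t} and {> t}."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = 0        # background (<= t) pixel count
    sum0 = 0.0    # background intensity sum
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                 # background mean
        m1 = (sum_all - sum0) / w1     # foreground mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal image (e.g., dark background, bright tumor region), the returned threshold separates the two intensity populations; pixels above it can then form a candidate segmentation mask.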
It will be appreciated that a medical scan is not completed instantaneously. Because voluntary motions of the patient (e.g., breathing) or involuntary motions (e.g., organ motion such as the heartbeat, or gastrointestinal peristalsis) may change the position of the tumor, the first medical scan image may be a medical image having "dynamic" properties. The first medical scan image may be, for example, a scan of the target object over a preset time period. For example, the preset time period may cover approximately 10 respiratory cycles of the patient. In this way, the range of motion of the tumor may be reflected in the first medical scan image. Thus, the tumor planning target volume may also cover the range of motion of the tumor subject and potentially invaded region described above. By processing all images corresponding to the first medical scan image, the resulting tumor planning target volume may be a range that includes the tumor subject and the potentially invaded region, as well as their range of motion.
In some embodiments, target determination module 710 may expand the tumor boundary to determine the tumor planned target volume. Boundary expansion sets aside a margin of error (e.g., uncontrolled movement of the patient, or repositioning effects during radiotherapy, resulting in unexpected positional changes of the tumor) to ensure that the tumor subject and potentially invaded region receive the prescribed radiation dose. In some embodiments, the tumor region boundary is expanded by no more than 8 millimeters. For example, the tumor boundary may be expanded by 8 mm, 7 mm, 6 mm, 5 mm, 4 mm, 3 mm, 2 mm, 1 mm, etc. In some embodiments, the tumor region boundary is expanded by a distance of 3-8 millimeters. For example, for a tumor in the head and neck region of a patient, the boundary expansion may be 3-5 millimeters. For another example, the boundary expansion may be 5-8 mm for a tumor in the chest and abdomen region of a patient. By contrast, to ensure the radiotherapy effect, a boundary expansion of 5-10 mm is commonly used in existing tumor radiotherapy practice. The range obtained after the tumor region boundary is expanded may be the range of the tumor planning target volume. By acquiring medical scan images using multi-plane dynamic imaging based on the field direction, the extent of the tumor subject and potentially invaded regions, and their range of motion, can be precisely determined. Thus, the boundary expansion may be small, for example 1 mm, or even omitted. In this way, the exposure of normal tissue to unnecessary radiation can be reduced.
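By way of illustration, a simplified two-dimensional version of the boundary expansion can be sketched by pushing each contour point radially outward from the contour centroid. Clinical systems expand margins anisotropically in three dimensions, so this is only a toy sketch and the function name is hypothetical.

```python
import math

def expand_boundary(points, margin_mm):
    """Push each 2D boundary point radially outward from the centroid by
    `margin_mm` (a simplified isotropic expansion of a tumor contour)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    expanded = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        r = math.hypot(dx, dy)
        if r == 0:
            expanded.append((x, y))  # point at the centroid: leave as-is
            continue
        s = (r + margin_mm) / r      # scale factor adding the margin
        expanded.append((cx + dx * s, cy + dy * s))
    return expanded
```

For a contour at radius 10 mm around its centroid, a 3 mm margin moves every point to radius 13 mm, i.e., a uniform flare of the planned target boundary.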
In some embodiments, the tumor planning target volume may be represented using a set of coordinate points. For example in the world coordinate system, in the anatomical coordinate system (patient coordinate system) or in the medical image coordinate system. The tumor planning target volume may be represented using a set of coordinate points of a boundary of the tumor planning target volume.
Step 320, acquiring a second medical scan image of the target object, and determining a real-time motion trajectory of a tumor-associated region of the target object based on the second medical scan image. This step may be performed by the trajectory determination module 720.
In some embodiments, the second medical scan image is the same as or similar to the first medical scan image, and may be obtained after scanning the target object using various types of related imaging devices (e.g., CT, MRI, PET, PET-CT, C-arm (e.g., DSA)). In some embodiments, the second medical scan image is preferably obtained using MRI. In some embodiments, the second medical scan image may also be a magnetic resonance image acquired based on multi-planar dynamic imaging of the field direction. The number and directions of the corresponding field directions are the same as those corresponding to the first medical scan image. The data form presented (e.g., slice datasets that can be reconstructed into a "dynamic" three-dimensional image for each field direction) is also the same.
Similarly, the trajectory determination module 720 may also process the second medical scan image using a target recognition/image segmentation algorithm to determine a real-time motion trajectory of a tumor-associated region of the target object. The tumor-associated region may be the tumor subject as well as the potentially invaded region described above. The real-time motion trajectory may be a real-time motion trajectory of feature points of the tumor-associated region. For example, after processing by the target recognition/image segmentation algorithm, the extent of the tumor-associated region may be highlighted in the second medical scan image with an identification symbol (e.g., line segments in a distinguishing color). The tangent points of these line segments with a plurality of straight lines may be used as the feature points.
In some embodiments, the first medical scan image and the second medical scan image may be acquired at different time periods. Taking radiation therapy as an example, the first medical scan may be performed before radiation therapy, for example, using an MRI simulation positioning device. The second medical scan may be performed during radiation therapy, for example, on an MRI-guided radiotherapy apparatus.
In some embodiments, the real-time motion profile may also be represented using a set of coordinate points. For example, the real-time motion trajectory is represented by a set of coordinate points in the same coordinate system in which the coordinate points representing the tumor planned target area are located.
Step 330, based on the tumor planned target area and/or the real-time motion trajectory, performing at least one of the following operations: target volume confirmation, beam opening and closing, and dose estimation. This step may be performed by the execution module 730.
In connection with the foregoing, when the first and second medical scan images are acquired before and during radiotherapy, respectively, it will be appreciated that differences between the two stages (e.g., involuntary or uncontrolled movements of the patient during radiotherapy, or differences between the machines used for simulation positioning and for radiotherapy) may make the tumor planned target volume inaccurate. For example, a tumor-associated region receiving radiation may move out of the range of the tumor planned target volume, which may result in unnecessary radiation exposure of normal tissue. In addition, evaluating the radiotherapy effect based on medical scan images acquired after radiotherapy can help improve the radiotherapy process.
Based on this, the tumor planned target volume determined from the first medical scan image and/or the real-time motion trajectory determined from the second medical scan image may be used to achieve the above objects. For example, the execution module 730 may compare whether the real-time motion trajectory is outside the range of the tumor planned target volume to determine whether the tumor planned target volume is reasonable. For another example, the execution module 730 may control the opening and closing of the beam in combination with the range of the tumor planned target volume and the range of the real-time motion trajectory. For another example, the execution module 730 may combine auxiliary medical scan images, such as CT images, with real-time images reflecting the radiotherapy process to determine the actual radiation dose received by the target object. Other descriptions of target volume confirmation, beam opening and closing, and dose estimation may be found elsewhere in this application, e.g., in figs. 4-6. The auxiliary medical scan image may also be an MRI image.
It should be noted that the above description of the steps in fig. 3 is for illustration and description only, and does not limit the application scope. Various modifications and changes to the individual steps of fig. 3 may be made by those skilled in the art under the guidance of this application. However, such modifications and variations are still within the scope of the present application.
According to the image processing method disclosed in the present application, target volume delineation and confirmation can be carried out based on the medical scan images obtained by multi-plane dynamic imaging in the field directions, the boundary expansion range can be accurately and effectively reduced, and the probability of normal tissue receiving unnecessary radiation beams is reduced.
Fig. 4 is an exemplary flow chart of tumor planned target volume update according to some embodiments of the present application. In some embodiments, the process 400 may be performed by the execution module 730 of the processing system 700. For example, the process 400 may be stored in a storage device (e.g., a self-contained memory unit or an external memory device of the processing system 700) in the form of a program or instructions that, when executed, may implement the process 400. As shown in fig. 4, the process 400 may include the following operations.
Step 410, determining whether the real-time motion trajectory of the tumor-associated region exceeds the tumor planned target region.
In some embodiments, the set of coordinate points corresponding to the tumor planned target volume and the set of coordinate points corresponding to the tumor-associated region may be compared to determine whether the real-time motion trajectory exceeds the tumor planned target volume. For example, it may be checked whether the coordinate points representing the real-time motion trajectory are located within the coordinate range enclosed by the coordinate points representing the tumor planning target volume. If yes, the real-time motion trajectory of the tumor-associated region does not exceed the tumor planning target volume, and the process 400 may proceed to step 420. If not, the real-time motion trajectory of the tumor-associated region exceeds the tumor planning target volume, and the process 400 may proceed to step 430.
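For illustration, the coordinate comparison of step 410 can be sketched in two dimensions with a standard ray-casting point-in-polygon test. The real comparison operates on three-dimensional coordinate sets, and the function names here are illustrative only.

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: count boundary crossings of a horizontal ray
    cast rightward from `pt`; an odd count means the point is inside."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def trajectory_within_target(trajectory, target_boundary):
    """True only if every real-time trajectory point lies inside the
    boundary of the planned target volume."""
    return all(point_in_polygon(p, target_boundary) for p in trajectory)
```

A trajectory with even one point outside the boundary fails the check, which corresponds to the branch toward step 430.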
Step 420, determining the tumor planned target area as a target area.
In some embodiments, when the real-time motion trajectory of the tumor-associated region does not exceed the tumor planned target volume, the tumor-associated region is considered to receive all radiation during the entire radiation delivery for the tumor planned target volume, without dose loss caused by the tumor moving out of the field or by normal tissue entering the irradiation field. At this time, the tumor planned target volume may be determined as the target volume. The target volume may be used for the subsequent radiotherapy procedure.
Step 430, updating the tumor plan target volume based on the second medical scan image.
In some embodiments, an off-target condition may be considered to occur when the real-time motion trajectory of the tumor-associated region exceeds the tumor planned target volume. That is, normal tissue is irradiated because the motion trajectory of the tumor-associated region is out of the range of the tumor planned target volume. In addition, even when normal tissue is not irradiated, the radiation dose delivered to the tumor-associated region is reduced. If radiotherapy continues under these conditions, the therapeutic effect will be reduced. Thus, the tumor planned target volume determined in process 300 needs to be updated to avoid this.
Referring to step 310 of process 300, the target recognition/image segmentation algorithm may be used to process the second medical scan image to determine a new tumor planning target volume. For example, the second medical scan image may first be processed using the target recognition/image segmentation algorithm to determine a new tumor boundary of the tumor of the target object. This new tumor boundary can then be expanded to obtain the new tumor planning target volume. As before, the expansion distance may be no more than 5 millimeters, 4 millimeters, 3 millimeters, 2 millimeters, 1 millimeter, etc., whereas a boundary expansion of 5-10 mm is commonly used in existing tumor radiotherapy practice. The newly determined tumor planned target volume may be used for subsequent radiotherapy-related procedures, for example, as the radiation irradiation region during radiotherapy.
It should be noted that the above description of the steps in fig. 4 is for illustration and description only, and does not limit the application scope. Various modifications and changes to the individual steps of fig. 4 may be made by those skilled in the art under the guidance of this application. However, such modifications and variations are still within the scope of the present application.
Fig. 5 is an exemplary flow chart of beam switching according to some embodiments of the present application. In some embodiments, the process 500 may be performed by the execution module 730 of the processing system 700. For example, the process 500 may be stored in a storage device (e.g., a self-contained memory unit or an external memory device of the processing system 700) in the form of a program or instructions that, when executed, may implement the process 500. As shown in fig. 5, the process 500 may include the following operations.
Step 510, determining whether the real-time motion trajectory of the tumor-associated region exceeds the tumor planning target region.
In some embodiments, step 510 may be the same as or similar to step 410, and reference may be made to the relevant description. If the real-time motion trajectory of the tumor-associated region does not exceed the tumor planned target volume, the process 500 may proceed to step 520. If the real-time motion trajectory of the tumor-associated region exceeds the tumor planning target volume, the process 500 may proceed to step 530.
Step 520, generating a first control signal and transmitting it to a radiation generating device, such that the radiation generating device emits a radiation beam based on the first control signal.
In some embodiments, when the real-time motion profile of the tumor-associated region does not exceed the tumor-planned target region, it is considered that the tumor-associated region will be within a predetermined irradiation range of radiation throughout the radiation emission process for the tumor-planned target region. That is, in this case a beam of radiation is applied to the target object, which will only irradiate the tumor-associated area. Thus, a first control signal for controlling the radiation generating device, such as an accelerator, to emit a radiation beam may be generated and transmitted. The radiation generating device may start to operate to emit a radiation beam based on information carried by the first control signal after receiving the signal.
Step 530, generating a second control signal and transmitting the second control signal to the radiation generating device so that the radiation generating device stops emitting the radiation beam based on the second control signal.
In some embodiments, an off-target condition may be considered to occur when the real-time motion trajectory of the tumor-associated region exceeds the tumor planned target volume. That is, if the radiation beam were applied to the target object in this case, the radiation would impinge on normal tissue and cause additional damage. At this time, a second control signal for controlling the radiation generating device to stop emitting the radiation beam or to stop operating may be generated and transmitted. After receiving the second control signal, the radiation generating device may stop operating or cease emitting the radiation beam based on the information carried by the signal.
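The branch logic of steps 510-530 amounts to a simple gating rule, sketched below. The signal names and the containment predicate are placeholders for whatever interface the radiation generating device actually exposes; they are not defined by this application.

```python
# Hypothetical signal identifiers standing in for the first and second
# control signals of steps 520 and 530.
BEAM_ON, BEAM_OFF = "first_control_signal", "second_control_signal"

def gate_beam(trajectory_points, target_contains):
    """Return the control signal for the radiation generating device:
    beam on only while every real-time trajectory point stays inside
    the planned target volume, otherwise beam off."""
    if all(target_contains(p) for p in trajectory_points):
        return BEAM_ON   # step 520: emit the radiation beam
    return BEAM_OFF      # step 530: stop emitting the radiation beam
```

Called repeatedly as new trajectory points arrive, this rule switches the beam off as soon as any monitored point leaves the planned target volume, matching the off-target handling described above.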
It should be noted that the above description of the steps in fig. 5 is for illustration and description only, and does not limit the application scope. Various modifications and changes to the steps of fig. 5 may be made by those skilled in the art under the guidance of the present application. However, such modifications and variations are still within the scope of the present application.
Fig. 6 is an exemplary flow chart of dose estimation according to some embodiments of the present application. In some embodiments, the process 600 may be performed by the execution module 730 of the processing system 700. For example, the process 600 may be stored in a storage device (e.g., a self-contained memory unit or an external memory device of the processing system 700) in the form of a program or instructions that, when executed, may implement the process 600. As shown in fig. 6, the process 600 may include the following operations.
Step 610, acquiring a tumor-related auxiliary medical scan image of the target object.
In some embodiments, the auxiliary medical scan image may be an image having a different modality than the first medical scan image and the second medical scan image, such as a CT image, a PET image, a US image, etc. Illustratively, the auxiliary medical scan image may be a CT image, for example a 3D-CT image or a 4D-CT image. In some embodiments, the auxiliary medical scan image may be acquired in advance, for example, before beam emission begins, or before the process 300 begins, and may then be stored for later use.
Step 620, registering and fusing the second medical scan image with the auxiliary medical scan image to obtain a verification medical image.
In some embodiments, an image registration/image fusion algorithm may be used to process the second medical scan image and the auxiliary medical scan image, with the two being fused after registration to obtain the verification medical image. Exemplary image registration algorithms may include mutual information-based image registration algorithms, fourier transform-based image registration algorithms, wavelet transform-based image registration algorithms, deep learning-based image registration algorithms such as CNN, superCNN, GAN, deepsimilarity, RL, regvalidation, and the like. Exemplary image fusion algorithms may include multi-resolution pyramids, wavelet transforms, deep neural network based models such as CNN, IFCNN, zero-LMF, CNN-CO, EMFusion, DPCN-Fusion, MSPRN, DDcGAN, MAMDcGAN, DCGAN, GFPPCIF, fusionDN, PMGI, cu-Net, SDNet, DIF-Net, deepFuse, IFSepR, MTOE, and the like.
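As a toy stand-in for the registration and fusion algorithms listed above, the sketch below registers two small single-channel images by exhaustively searching integer shifts that minimize the sum of squared differences, then fuses them by pixel-wise averaging. Real medical registration is multi-modal and often deformable, so none of this reflects the actual implementation; the function names are assumptions.

```python
def register_shift(fixed, moving, max_shift=2):
    """Exhaustive search for the integer (dy, dx) shift of `moving` that
    minimizes the sum of squared differences against `fixed`."""
    h, w = len(fixed), len(fixed[0])
    best, best_ssd = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd = 0.0
            for y in range(h):
                for x in range(w):
                    my, mx = y - dy, x - dx
                    if 0 <= my < h and 0 <= mx < w:
                        ssd += (fixed[y][x] - moving[my][mx]) ** 2
                    else:
                        ssd += fixed[y][x] ** 2  # penalize out-of-frame pixels
            if ssd < best_ssd:
                best_ssd, best = ssd, (dy, dx)
    return best

def fuse_average(a, b):
    """Pixel-wise average fusion of two already-registered images."""
    return [[(pa + pb) / 2 for pa, pb in zip(ra, rb)] for ra, rb in zip(a, b)]
```

In this simplified pipeline, the second medical scan image would play the role of `moving`, the auxiliary image the role of `fixed`, and the fused result the role of the verification medical image.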
Step 630, determining a radiation dose received by the tumor planning target volume based on the verification medical image.
In some embodiments, the total dose of radiation delivered throughout the radiation exposure may be mapped onto the verification medical image, e.g., onto the tumor planned target volume (or the tumor-associated region), to obtain the radiation dose received by the tumor planned target volume throughout the radiation exposure.
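In the simplest terms, the mapping of step 630 accumulates delivered dose over the voxels labeled as belonging to the tumor planned target volume. The sketch below assumes a 2D dose grid and a binary target mask already aligned on the verification medical image; both names are illustrative, and real dose accounting works on 3D grids with physical units.

```python
def dose_in_target(dose_grid, target_mask):
    """Accumulate delivered dose over grid cells flagged as part of the
    tumor planned target volume (simplified dose-mapping sketch)."""
    total = 0.0
    for dose_row, mask_row in zip(dose_grid, target_mask):
        for d, m in zip(dose_row, mask_row):
            if m:
                total += d
    return total
```

Dividing the total by the number of masked cells would give a mean target dose, one of the simplest summary statistics used when comparing delivered against prescribed dose.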
Fig. 7 is an exemplary block diagram of a processing system for image processing according to some embodiments of the present application. The processing system may enable accurate processing of medical scan images such as target volume accurate determination. As shown in fig. 7, processing system 700 may include a target determination module 710, a trajectory determination module 720, and an execution module 730.
The target determination module 710 may be configured to acquire a first medical scan image of a target object and determine a tumor planned target volume of the target object based on the first medical scan image. The target object may be a patient, an animal, or the like, or a portion thereof, for example, certain organs, tissues, etc. of the patient. Illustratively, the target object may be a suspected tumor region. For example, if the patient suffers from lung cancer, liver cancer, stomach cancer, breast cancer, etc., the target object may be the whole thoracic cavity, abdominal cavity, etc. of the patient. The first medical scan image may comprise a magnetic resonance image acquired based on multi-planar dynamic imaging of the field direction. The field direction may be the irradiation direction of the radiation beam used for radiotherapy, and may be represented by the rotation angle of the gantry. For example, the field direction may be 0°, 90°, 180°, 270°, 300°, 330°, and the like. Based on the field direction, an imaging plane of the corresponding magnetic resonance scan can be set. For any one of the field directions, the imaging plane may include at least a first plane along an incident path of the field, and a second plane perpendicular to the incident path. For any one of the field directions, the imaging plane may include a first plane along the incident path of the field, a second plane perpendicular to the incident path, and a third plane along the incident path of the field. The first plane, the second plane, and the third plane are perpendicular to each other. The imaging planes (e.g., the first plane or the third plane) along the incident path of the field may not be perfectly aligned with the path; a reasonable deviation between them is allowed. An angle may be present between the first plane or the third plane and the incident path, which may not exceed 45°, 30°, 20°, 10°, 9°, 8°, 7°, 6°, 5°, 4°, 3°, 2°, or 1°.
Similarly, a second angle may be present between the perpendicular to the second plane and the incident path, which may not exceed 45°, 30°, 20°, 10°, 9°, 8°, 7°, 6°, 5°, 4°, 3°, 2°, or 1°. The tumor planned target may refer to the area that receives radiation from the beam. This area may include the tumor body and potentially invaded regions distributed around the tumor body. The target determination module 710 may be configured to process the first medical scan image using a target recognition/image segmentation algorithm to determine the tumor boundary of a tumor of the target object. The target recognition/image segmentation algorithm may be implemented using machine learning, preferably deep learning, more preferably a deep neural network model. The target determination module 710 may then expand the tumor boundary outward to determine the tumor planned target. The tumor region boundary is expanded by no more than 5 millimeters, whereas the expansion distance commonly used in existing tumor radiotherapy practice is 5-10 millimeters.
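The boundary expansion step can be sketched as a distance-based dilation of the segmented tumor mask: every voxel within a fixed physical margin of the tumor surface joins the planned target. The function name, the use of a Euclidean distance transform from `scipy.ndimage`, and the boolean-mask representation are illustrative assumptions; the margin itself would be chosen per the text (no more than 5 mm here, versus the 5-10 mm common in existing practice).

```python
import numpy as np
from scipy import ndimage

def expand_target(tumor_mask, margin_mm, voxel_mm):
    # tumor_mask: boolean 3-D array from the segmentation step.
    # voxel_mm: physical voxel size per axis, so the margin is in millimeters.
    # Distance (mm) from each voxel to the nearest tumor voxel; tumor
    # voxels themselves get distance 0 and therefore stay included.
    dist = ndimage.distance_transform_edt(~tumor_mask, sampling=voxel_mm)
    return dist <= margin_mm
```

Working in physical units (via `sampling`) keeps the expansion isotropic even when the MR voxels are anisotropic.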
The trajectory determination module 720 may be configured to acquire a second medical scan image of the target object and determine a real-time motion trajectory of a tumor-associated region of the target object based on the second medical scan image. The second medical scan image is the same as or similar to the first medical scan image, and may be a magnetic resonance image acquired based on multi-planar dynamic imaging of the field direction, with the same number of field directions and the same directions as those corresponding to the first medical scan image. Similarly, the trajectory determination module 720 may also process the second medical scan image using a target recognition/image segmentation algorithm to determine the real-time motion trajectory of the tumor-associated region of the target object.
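One minimal way to reduce the segmented dynamic series to a real-time motion trajectory is to track the tumor centroid frame by frame. This centroid reduction, the function name, and the boolean-mask-per-frame representation are illustrative assumptions; the disclosure only requires that some motion trajectory of the tumor-associated region be derived from the second medical scan image.

```python
import numpy as np

def motion_trajectory(frames):
    # frames: sequence of boolean tumor masks, one per dynamic MR time point,
    # produced by the same segmentation step used for the first image.
    traj = []
    for mask in frames:
        coords = np.argwhere(mask)        # voxel indices of tumor voxels
        traj.append(coords.mean(axis=0))  # centroid of this frame
    return np.array(traj)                 # shape: (n_frames, ndim)
```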
The execution module 730 may be configured to perform at least one of the following operations based on the tumor planned target and/or the real-time motion trajectory: target volume confirmation, beam opening and closing, and dose estimation. The execution module 730 may compare the real-time motion trajectory against the range of the tumor planned target to determine whether the tumor planned target is reasonable. The execution module 730 may also control the opening and closing of the beam based on the range of the tumor planned target and the range of the real-time motion trajectory. The execution module 730 may further combine the second medical scan image with auxiliary medical scan images such as CT images to obtain a verification image reflecting the radiotherapy process in real time, so as to determine the actual radiation dose received by the target object.
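The beam opening-and-closing decision can be sketched as a gating check: the beam is kept on only while the tracked tumor position lies inside the planned target volume. The function name, the string stand-ins for the first/second control signals, and the mask-lookup scheme are illustrative assumptions under a simple voxel-grid model.

```python
import numpy as np

def beam_gate(trajectory_point, target_mask, voxel_mm=1.0):
    # trajectory_point: current tracked tumor position in millimeters.
    # target_mask: boolean array marking the tumor planned target volume.
    # Convert the physical position to the nearest voxel index.
    idx = tuple(np.round(np.asarray(trajectory_point) / voxel_mm).astype(int))
    inside = (
        all(0 <= i < s for i, s in zip(idx, target_mask.shape))
        and bool(target_mask[idx])
    )
    # "BEAM_ON" / "BEAM_OFF" stand in for the first / second control signals
    # sent to the ray generation device.
    return "BEAM_ON" if inside else "BEAM_OFF"
```

In a real system this decision would run once per dynamic imaging frame, so the gating latency is bounded by the MR frame rate.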
For further description of the above modules, reference may be made to other parts of the application, for example, fig. 3 to 6.
It should be appreciated that the system shown in fig. 7 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, for example provided on a carrier medium such as a magnetic disk, CD, or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only with hardware circuitry such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also with software executed by various types of processors, and with a combination of the above hardware circuitry and software (e.g., firmware).
It should be noted that the above description of the modules is for convenience of description only and is not intended to limit the application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. For example, each module may share one memory module, or each module may have a respective memory module. Such variations are within the scope of the present application.
Having described the basic concepts herein, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested by this application, and are therefore within the spirit and scope of the exemplary embodiments of this application.
Meanwhile, the present application uses specific words to describe embodiments of the present application. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present application. Thus, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this application are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present application may be combined as suitable.
Furthermore, those skilled in the art will appreciate that the various aspects of the invention are illustrated and described in the context of a number of patentable categories or circumstances, including any novel and useful procedures, machines, products, or materials, or any novel and useful modifications thereof. Accordingly, aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on baseband or as part of a carrier wave. The propagated signal may take a variety of forms, including electromagnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber-optic cable, RF, or the like, or a combination of any of the foregoing.
The computer program code necessary for operation of portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or provided in a cloud computing environment as a service, such as software as a service (SaaS).
Furthermore, unless explicitly recited in the claims, the order in which processing elements and sequences are described, the use of numbers and letters, or the use of other designations in this application is not intended to limit the order of the processes and methods of this application. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed herein and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in the claims. Indeed, claimed subject matter may lie in fewer than all features of a single embodiment disclosed above.
In some embodiments, numbers are used to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are modified in some examples by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and apply ordinary rounding. Although the numerical ranges and parameters used to confirm the breadth of ranges in some embodiments of this application are approximations, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this application is hereby incorporated by reference in its entirety. Excepted are application history documents that are inconsistent with or in conflict with the content of this application, as well as documents (currently or later attached to this application) that limit the broadest scope of the claims of this application. It is noted that if the descriptions, definitions, and/or use of terms in the materials accompanying this application are inconsistent or in conflict with what is stated in this application, the descriptions, definitions, and/or use of terms in this application shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of this application. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present application may be considered in keeping with the teachings of the present application. Accordingly, embodiments of the present application are not limited to only the embodiments explicitly described and depicted herein.

Claims (27)

1. An image processing method, the method comprising:
acquiring a first medical scanning image of a target object, and determining a tumor planning target area of the target object based on the first medical scanning image;
Acquiring a second medical scanning image of a target object, and determining a real-time motion trail of a tumor-related region of the target object based on the second medical scanning image;
based on the tumor planned target area and/or the real-time motion trajectory, performing at least one of the following operations: target volume confirmation, beam opening and closing, and dose estimation;
wherein,
the first medical scan image and the second medical scan image comprise magnetic resonance images acquired based on multi-planar dynamic imaging of a portal direction, the portal direction comprising a plurality of; for any field direction, the imaging plane comprises at least a first plane along an incident path of the field and a second plane perpendicular to the incident path; the first plane and the second plane are perpendicular.
2. The image processing method according to claim 1, wherein the field directions used in acquiring the first medical scan image and the second medical scan image are the same in number and in direction.
3. The image processing method of claim 1, wherein the determining a tumor planned target area comprises:
determining a tumor region of the target object by processing the first medical scan image using machine learning, preferably deep learning, more preferably a deep neural network model; expanding the tumor boundary of the tumor region to obtain the tumor planning target region; wherein the tumor region boundary is expanded by no more than 8 millimeters.
4. The image processing method according to claim 3, wherein the tumor region boundary is expanded by 3-8 millimeters.
5. The image processing method according to claim 1, wherein for any of the portal directions, the imaging plane includes a first plane along an incident path of the portal, a second plane perpendicular to the incident path, and a third plane along the incident path of the portal; the first plane, the second plane and the third plane are perpendicular to each other.
6. The image processing method according to claim 5, wherein an angle between the first plane or the third plane and the incident path is not more than 45 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 30 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 10 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 1 °.
7. The image processing method according to claim 5, wherein a second angle between a perpendicular to the second plane and the incident path is not more than 45 °; or, a second included angle between the perpendicular of the second plane and the incident path is not more than 30 degrees; or, a second included angle between the perpendicular of the second plane and the incident path is not more than 10 degrees; alternatively, the second angle between the perpendicular to the second plane and the incident path is no more than 1 °.
8. The image processing method according to claim 1, wherein the target area confirmation includes:
determining whether the real-time motion trail of the tumor-associated region exceeds the tumor planning target region;
if yes, confirming the tumor planned target area as a target area;
if not, updating the tumor planning target region based on the second medical scanning image.
9. The image processing method of claim 8, wherein the updating the tumor plan target volume based on the second medical scan image comprises:
processing the second medical scan image using machine learning, preferably using deep learning, more preferably using a deep neural network model, determining a tumor region of the target object and updating a tumor region outer boundary;
updating the tumor planned target region based on updating the tumor region expansion boundary.
10. The image processing method according to claim 1, wherein the beam switching includes:
determining whether the real-time motion trail of the tumor-associated region exceeds the tumor planning target region;
if the real-time motion trail is positioned in the tumor planned target area, generating a first control signal and transmitting the first control signal to a ray generation device so that the ray generation device emits a ray bundle based on the first control signal;
And if the real-time motion trail is positioned outside the tumor planned target area, generating a second control signal and transmitting the second control signal to the ray generation equipment so that the ray generation equipment stops emitting ray beams based on the second control signal.
11. The image processing method according to claim 1, wherein the dose estimation includes:
acquiring an auxiliary medical scanning image related to the tumor of the target object;
registering and fusing the second medical scanning image and the auxiliary medical scanning image to obtain a verification medical image;
based on the validated medical image, a radiation dose received by the tumor planning target volume is determined.
12. The image processing method according to claim 11, wherein the auxiliary medical image includes at least a 4D-CT image or a 3D-CT image.
13. An image processing apparatus, characterized in that the apparatus comprises:
a tumor plan target volume determination module configured to acquire a first medical scan image of a target object and determine a tumor plan target volume of the target object based on the first medical scan image;
the track determining module is configured to acquire a second medical scanning image of the target object and determine a real-time motion track of a tumor related area of the target object based on the second medical scanning image;
An execution module configured to perform at least one of the following operations based on the tumor planned target volume and/or the real-time motion trajectory: target volume confirmation, beam opening and closing, and dose estimation;
wherein,
the first medical scan image and the second medical scan image comprise magnetic resonance images acquired based on multi-planar dynamic imaging of a portal direction, the portal direction comprising a plurality of; for any field direction, the imaging plane comprises at least a first plane along an incident path of the field and a second plane perpendicular to the incident path; the first plane and the second plane are perpendicular.
14. The image processing apparatus according to claim 13, wherein the field directions used in acquiring the first medical scan image and the second medical scan image are the same in number and in direction.
15. The image processing device of claim 13, wherein to determine a tumor-planned target, the target determination module is configured to:
determining the target object tumor region using machine learning, preferably using deep learning, more preferably using a deep neural network model to process the first medical scan image;
expanding the tumor boundary of the tumor region to obtain the tumor planning target region; wherein the tumor boundary is expanded by no more than 8 millimeters.
16. The image processing apparatus of claim 15, wherein the tumor boundary is expanded by 3-8 millimeters.
17. The image processing apparatus according to claim 13, wherein the imaging plane includes, for any of the portal directions, a first plane along an incident path of the portal, a second plane perpendicular to the incident path, and a third plane along the incident path of the portal; the first plane, the second plane and the third plane are perpendicular to each other.
18. The image processing apparatus according to claim 17, wherein an angle between the first plane or the third plane and the incident path is not more than 45 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 30 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 10 °; alternatively, the angle between the first plane or the third plane and the incident path is not more than 1 °.
19. The image processing apparatus according to claim 17, wherein a second angle between a perpendicular to the second plane and the incident path is not more than 45°; or, the second angle between the perpendicular to the second plane and the incident path is not more than 30°; or, the second angle between the perpendicular to the second plane and the incident path is not more than 10°; alternatively, the second angle between the perpendicular to the second plane and the incident path is not more than 1°.
20. The image processing apparatus of claim 13, wherein to perform the target validation operation, the execution module is configured to:
determining whether the real-time motion trail of the tumor-associated region exceeds the tumor planning target region;
if yes, confirming the tumor planned target area as a target area;
if not, updating the tumor planning target region based on the second medical scanning image.
21. The image processing apparatus of claim 20, wherein to update the tumor planning target volume based on the second medical scan image, the execution module is configured to:
processing the second medical scan image using a deep neural network model, determining an updated tumor region of the target object and an updated tumor region boundary;
updating the tumor planned target region based on the updated tumor region and the updated tumor region expansion boundary.
22. The image processing apparatus according to claim 13, wherein to perform the beam opening and closing operation, the execution module is configured to:
determining whether the real-time motion trail of the tumor-associated region exceeds the tumor planning target region;
if the real-time motion trail is positioned in the tumor planned target area, generating a first control signal and transmitting the first control signal to a ray generation device so that the ray generation device emits a ray bundle based on the first control signal;
and if the real-time motion trail is positioned outside the tumor planned target area, generating a second control signal and transmitting the second control signal to the ray generation equipment so that the ray generation equipment stops emitting ray beams based on the second control signal.
23. The image processing apparatus of claim 13, wherein to perform the dose estimation operation, the execution module is configured to:
acquiring an auxiliary medical scanning image related to the tumor of the target object;
registering and fusing the second medical scanning image and the auxiliary medical scanning image to obtain a verification medical image;
Based on the validated medical image, a radiation dose received by the tumor planning target volume is determined.
24. The image processing apparatus according to claim 23, wherein the auxiliary medical image comprises at least a 4D-CT image or a 3D-CT image.
25. An apparatus, comprising: memory, a processor and a computer program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the image processing method according to any of claims 1-12.
26. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1-12.
27. A medical system, characterized in that the medical system comprises an image processing apparatus as claimed in any one of claims 13-24.
CN202310532321.5A 2023-05-11 2023-05-11 Image processing method, device, system and storage medium Pending CN117815573A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310532321.5A CN117815573A (en) 2023-05-11 2023-05-11 Image processing method, device, system and storage medium


Publications (1)

Publication Number Publication Date
CN117815573A true CN117815573A (en) 2024-04-05

Family

ID=90515976


Similar Documents

Publication Publication Date Title
US10300305B2 (en) Image guidance for radiation therapy
US10188874B2 (en) Three dimensional localization and tracking for adaptive radiation therapy
CN109464757B (en) Method, system, device and storage medium for determining position of target object
EP3681600B1 (en) Radiotherapy treatment plan optimization workflow
US10152790B2 (en) Three dimensional localization of a moving target for adaptive radiation therapy
JP6886565B2 (en) Methods and devices for tracking surface movements
JP2018522651A (en) System and method for tracking a target using quality indicators during radiation therapy
WO2020001408A1 (en) Radiotherapy plan formulating method, radiotherapy plan system, and radiotherapy system
EP3634577B1 (en) Systems and methods of accounting for shape change during radiotherapy
US11786204B2 (en) Automatically-registered patient fixation device images
CN115485019A (en) Automatically planned radiation-based treatment
CN115666717A (en) Method and apparatus for deriving and using virtual volume structures to predict potential collisions in delivering therapeutic radiation
US9014454B2 (en) Method and apparatus pertaining to images used for radiation-treatment planning
US9919163B2 (en) Methods, systems and computer readable storage media for determining optimal respiratory phase for treatment
US20210121715A1 (en) Cardiac ablation using an mr linac
CN117815573A (en) Image processing method, device, system and storage medium
WO2022052434A1 (en) X-ray imaging system for radiation therapy
US20230241416A1 (en) X-ray imaging system for radiation therapy

Legal Events

Date Code Title Description
PB01 Publication