CN116490897A - Real-time image guiding method, device and system and radiotherapy system - Google Patents

Info

Publication number
CN116490897A
Authority
CN
China
Prior art keywords
image, target, real-time, target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080107396.3A
Other languages
Chinese (zh)
Inventor
闫浩
王中亚
李久良
李金升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Our United Corp
Original Assignee
Our United Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Our United Corp filed Critical Our United Corp
Publication of CN116490897A publication Critical patent/CN116490897A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Radiation-Therapy Devices (AREA)

Abstract

The application provides a real-time image guidance method, device, and system, and a radiotherapy system, and belongs to the technical field of radiotherapy. The image guidance system can perform reliable real-time image guidance of a target object based on an acquired target reference image and a real-time projection image of the target object. Because the target reference image is acquired after patient positioning (setup) is completed, real-time image guidance of the target object based on the target reference image and the real-time projection image achieves higher accuracy.

Description

Real-time image guiding method, device and system and radiotherapy system

Technical Field
The disclosure relates to the technical field of radiotherapy, and in particular to a real-time image guidance method, device, and system, and a radiotherapy system.
Background
In a radiation therapy scenario, image-guided radiation therapy (IGRT) techniques may be employed to locate and track the position of a target object (e.g., a patient's tumor) in real time to guide treatment.
In the related art, an IGRT system can acquire two-dimensional projection images of a target object at two different angles during treatment, register each two-dimensional projection image with a reference image at the corresponding angle to determine two two-dimensional offsets, and finally calculate the three-dimensional offset of the target object from the determined two-dimensional offsets, thereby tracking the position of the target object. The reference image may be an image reconstructed by the IGRT system from a computed tomography (CT) image of the target object acquired when the treatment plan was generated.
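The related-art step of combining two 2-D offsets into one 3-D offset can be sketched as follows. The imaging geometry (both views sharing the patient axis as the vertical detector axis, with the in-plane offset measured in the axial plane) and all names are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def offsets_2d_to_3d(offset_a, offset_b, angle_a_deg, angle_b_deg):
    """Combine two in-plane (u, v) offsets measured at two different
    shooting angles into one (x, y, z) offset.

    Assumed geometry (hypothetical, for illustration only): v is the
    patient axis (z) in both views, and u lies in the axial (x, y)
    plane perpendicular to each view direction.  The two shooting
    angles must differ, otherwise the system is singular.
    """
    ua, va = offset_a
    ub, vb = offset_b
    ta, tb = np.deg2rad([angle_a_deg, angle_b_deg])
    # u_a = -x*sin(ta) + y*cos(ta);  u_b = -x*sin(tb) + y*cos(tb)
    A = np.array([[-np.sin(ta), np.cos(ta)],
                  [-np.sin(tb), np.cos(tb)]])
    x, y = np.linalg.solve(A, [ua, ub])
    z = 0.5 * (va + vb)  # both views observe the same axial shift
    return np.array([x, y, z])
```

For a true offset of (1, 2, 3), a 0-degree view sees (u, v) = (2, 3) and a 90-degree view sees (-1, 3) under this convention, and the function recovers (1, 2, 3).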
However, the image guidance method of the related art tracks the position with low accuracy and poor flexibility.
Disclosure of Invention
The embodiments of the disclosure provide a real-time image guidance method, device, and system, and a radiotherapy system, which can solve the problems of low tracking accuracy and poor flexibility in the related art. The technical solution is as follows:
in one aspect, a real-time image guidance method is provided, the method including:
acquiring a target reference image of a target object, wherein the target reference image is an image determined based on an image acquired by an image guiding device after the positioning of the target object is completed;
acquiring a real-time projection image of the target object by adopting the image guiding device;
and guiding the target object in real time according to the target reference image and the real-time projection image.
In another aspect, there is provided a real-time image guidance apparatus, the apparatus including:
the first acquisition module is used for acquiring a target reference image of a target object, wherein the target reference image is an image determined based on an image acquired by the image guiding device after the positioning of the target object is completed;
The second acquisition module is used for acquiring a real-time projection image of the target object by adopting the image guiding device;
and the image guiding module is used for performing real-time image guidance of the target object according to the target reference image and the real-time projection image.
In yet another aspect, a real-time image guidance system is provided, the real-time image guidance system comprising: an image guidance device, a processor, and a memory;
the image guiding device is used for acquiring images, and the memory stores instructions which are loaded and executed by the processor to implement the real-time image guidance method described in the above aspect.
In yet another aspect, a storage medium is provided, having instructions stored therein that, when executed on a processing component, cause the processing component to perform the real-time image guidance method described in the above aspects.
In yet another aspect, there is provided a radiation therapy system comprising: a patient support device, a mainframe, and a real-time image guidance system; the real-time image guidance system is the system described in the above aspect;
the host is respectively connected with the real-time image guiding system and the patient support device, the real-time image guiding system is used for sending the determined target offset of the target object to the host, and the host is used for adjusting the position of the patient support device based on the target offset.
The technical scheme provided by the embodiment of the disclosure has at least the following beneficial effects:
In summary, the embodiments of the present disclosure provide a real-time image guidance method, device, and system, and a radiation therapy system. The image guidance system can perform reliable real-time image guidance of the target object based on the acquired target reference image and real-time projection image of the target object. Because the target reference image is acquired after positioning is completed, guidance of the target object based on the target reference image and the real-time projection image achieves higher accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present disclosure; a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a radiation therapy system provided in an embodiment of the present disclosure;
FIG. 2 is a flow chart of a real-time image guidance method provided by an embodiment of the present disclosure;
FIG. 3 is a flow chart of another real-time image guidance method provided by an embodiment of the present disclosure;
FIG. 4 is a flow chart of a method of determining a target reference image provided by an embodiment of the present disclosure;
FIG. 5 is a schematic view of a marker provided in an embodiment of the present disclosure;
FIG. 6 is a flowchart of an image guidance method provided by an embodiment of the present disclosure;
FIG. 7 is a flowchart of another image guidance method provided by an embodiment of the present disclosure;
FIG. 8 is a flow chart of another real-time image guidance method provided by an embodiment of the present disclosure;
FIG. 9 is a flow chart of yet another real-time image guidance method provided by an embodiment of the present disclosure;
FIG. 10 is a block diagram of a real-time image guidance apparatus provided by an embodiment of the present disclosure;
FIG. 11 is a block diagram of another real-time image guidance apparatus provided by an embodiment of the present disclosure;
FIG. 12 is a block diagram of a real-time image guidance system provided by an embodiment of the present disclosure.
Specific embodiments of the present disclosure have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the disclosed concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
To make the purposes, technical solutions, and advantages of the present disclosure clearer, the embodiments of the present disclosure are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural view of a radiation therapy system provided in an embodiment of the present disclosure. As shown in fig. 1, the radiation therapy system can include a patient support 01, a mainframe 02, a real-time image guidance system 03, and a radiation therapy device 04.
Optionally, the patient support apparatus 01 may be a treatment couch, such as a three-dimensional or six-dimensional treatment couch as shown in fig. 1, although other apparatus for supporting a patient, such as a treatment chair, may also be used. The host 02 may be a control device, and the image guidance system 03 may be an IGRT system. The host 02 may establish communication connections with the patient support apparatus 01 and the image guidance system 03, which may be wired connections as shown in fig. 1 or wireless connections. During radiation therapy, the image guidance system 03 can track the position of a target object (e.g., a patient's tumor) in real time using the IGRT technique and send the offset of the target object to the host 02, and the host 02 can flexibly adjust the position of the patient support device 01 based on the received offset, thereby achieving reliable real-time image guidance of the patient.
Optionally, the real-time image guidance system 03 may comprise an image guidance device 031, and the image guidance device 031 may be a cone-beam computed tomography (CBCT) device. That is, the image guidance device 031 may be used to acquire CBCT images of the target object.
For example, referring to fig. 1, the image guidance device 031 may include one or more sets of image acquisition assemblies, each of which may include a detector 0311 and an oppositely disposed bulb (X-ray tube) 0312 (fig. 1 schematically illustrates only one set of image acquisition assemblies). The bulb 0312 may emit radiation, e.g., a cone beam; the detector 0311 may receive the radiation emitted by the bulb 0312; and the image guidance device 031 may generate a two-dimensional projection image of the target object based on the radiation received by the detector 0311.
Optionally, the bulb 0312 may be one capable of emitting kilovolt (kV) level X-rays, and the detector 0311 may be a flat-panel detector. Accordingly, the two-dimensional projection image acquired by the image guidance device 031 may be a kV-level X-ray projection image.
With continued reference to fig. 1, the radiation therapy device 04 can have a plurality of therapy sources 041 disposed thereon, each therapy source 041 can emit radiation, and the radiation emitted by the plurality of therapy sources 041 can be rotated about an axis of rotation and focused to a beam focus to treat a target object. Alternatively, the therapeutic source 041 may be a gamma-ray source, and the radiation emitted by the therapeutic source 041 is a gamma-ray; alternatively, the therapeutic source 041 may be an X-ray source, and accordingly, the radiation emitted by the therapeutic source 041 is X-rays. The image guidance device 031 described in the above embodiment may be provided in the radiotherapy apparatus 04.
In the radiotherapy process, the principle that the real-time image guiding system 03 tracks a target object by adopting the IGRT technology is as follows:
an image-guided device is used to acquire a reference image and a two-dimensional projection image of the target object, and the two images are registered. Registering two images may refer to taking a designated one of the images as a reference image and the other image as an image to be registered, where the purpose of registering is to make all points on the image to be registered and the reference image consistent. Because the two-dimensional projection image of the target object is generally an image acquired in real time in the radiotherapy process, namely an image acquired on site, the two-dimensional projection image of the target object can be used as an image to be registered.
In the related art, the reference image is generally a digitally reconstructed radiograph (DRR) generated from a CT image of the target object. However, the CT image is usually taken before radiotherapy, when the treatment plan is made, that is, before patient setup (positioning), and the setup operation subsequently performed on the patient introduces errors. The accuracy of a DRR generated from such a CT image is therefore low, and the registration accuracy is correspondingly low. In the embodiments of the present disclosure, the reference image is determined from an image acquired by the image guidance device 031 after positioning is completed, for example a DRR reconstructed from a CBCT image of the target object. This avoids the setup-induced errors of generating the DRR from a pre-setup CT image as in the related art, so the generated DRR is more accurate and, correspondingly, the registration accuracy is higher.
In addition, to obtain the three-dimensional offset of the target object, the related art must first obtain two-dimensional offsets from registrations at different shooting angles. Thus, if only one set of image acquisition components is included, that set must acquire a two-dimensional projection image of the target object at one angle and then at another angle. The two projection images are difficult to acquire with the patient in the same state, so their consistency is poor and the registration accuracy suffers accordingly. If two sets of image acquisition components are included, they occupy more space and correspondingly reduce the treatment space. In the embodiments of the present disclosure, the three-dimensional offset of the target object can be obtained from a two-dimensional projection image at a single angle; in other words, the image guidance device 031 only needs to include one set of image acquisition components. This guarantees real-time position tracking of the target object without affecting the treatment space, while maintaining high registration accuracy.
Fig. 2 is a flowchart of a real-time image guidance method according to an embodiment of the present disclosure, which may be applied to the real-time image guidance system 03 shown in fig. 1. As shown in fig. 2, the method may include:
step 201, a target reference image of a target object is acquired.
The target reference image is an image determined based on an image acquired by the image guidance device after the positioning of the target object is completed.
Alternatively, the target reference image may be a three-dimensional image or a two-dimensional image.
Step 202, acquiring a real-time projection image of a target object by adopting an image guiding device.
Alternatively, the real-time projection image may be a two-dimensional projection image of the target object captured by the image guidance device at the target capturing angle during the radiotherapy.
Step 203, performing real-time image guidance of the target object according to the target reference image and the real-time projection image.
Optionally, the image guidance system may perform an image registration operation based on the target reference image and the real-time projection image to obtain the target offset of the target object, and perform real-time image guidance of the target object based on the determined target offset, e.g., reliably adjust the position of the target object.
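The register-then-adjust loop of step 203 can be illustrated as a toy model, with the registration step and the couch hardware replaced by simple stand-ins (all names and the tolerance value are hypothetical):

```python
import numpy as np

def guide(target_pos, reference_pos, tolerance=0.5, max_iters=10):
    """Toy guidance loop: repeatedly measure the offset between the
    target's current (couch-adjusted) position and its reference
    position, and move the couch to cancel it.  The subtraction stands
    in for the image-registration step; `couch` stands in for the
    patient support device the host adjusts.
    """
    couch = np.zeros(3)
    for _ in range(max_iters):
        offset = (target_pos + couch) - reference_pos  # "registration"
        if np.linalg.norm(offset) <= tolerance:
            break
        couch -= offset  # host adjusts the patient support device
    return couch
```

For a target displaced by (1, 2, 3) from its reference position, the loop converges to a couch correction of (-1, -2, -3).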
In summary, the embodiments of the present disclosure provide a real-time image guidance method. The image guidance system can reliably perform real-time image guidance of the target object based on the acquired target reference image and real-time projection image of the target object. Because the target reference image is acquired after positioning is completed, guidance based on the target reference image and the real-time projection image achieves higher accuracy.
Alternatively, the target reference image acquired by the real-time image guidance system may be a three-dimensional image, or may be a two-dimensional image. The following examples illustrate the two ways, respectively:
as an alternative implementation: the target reference image is a three-dimensional image. Fig. 3 is a flowchart of another real-time image guidance method provided by an embodiment of the disclosure, as shown in fig. 3, the method may include:
and 301, under the condition that the positioning of the target object is completed, determining a target three-dimensional image reconstructed by the target two-dimensional projection images of the target object under different shooting angles as a target reference image.
The two-dimensional projection images of the target object under different shooting angles can be acquired by an image guiding device. Alternatively, the target three-dimensional image may be a CBCT image.
The target reference image can be determined in either of the following two ways:
in the first mode, after the target object is positioned, the real-time image guiding system can firstly adopt the image guiding device to re-shoot two-dimensional images under different angles and reconstruct to obtain a target reference image. Referring to the method flowchart shown in fig. 4, step 301 may include:
in step 3011, under the condition that the positioning of the target object is completed, an image guiding device is adopted to acquire target two-dimensional projection images of the target object under different shooting angles.
For example, if the image guidance device includes one set of image acquisition components, it may control the bulb in that set to emit rays at different shooting angles; the detector receives the rays at each angle, and the image guidance device generates a plurality of target two-dimensional projection images at the different shooting angles from the rays received by the detector. Alternatively, the image guidance device may include multiple sets of image acquisition components, in which case a plurality of target two-dimensional projection images at different shooting angles can be acquired simultaneously, improving acquisition efficiency.
Optionally, to ensure the quality of the target reference image obtained by the subsequent reconstruction, the real-time image guidance system may use the image guidance device to perform a full scan (i.e., one complete scan) within the radiotherapy apparatus 04 to obtain a plurality of target two-dimensional projection images at different shooting angles.
In addition, the real-time image guidance system may acquire the target two-dimensional projection images of the target object at different shooting angles using the image guidance device upon receiving an imaging instruction sent by the host.
Step 3012, reconstructing from the target two-dimensional projection images of the target object at different shooting angles.
After the target two-dimensional projection images of the target object at different shooting angles are acquired, the real-time image guidance system can reconstruct a target three-dimensional image of the target object from them.
Step 3013, determining the reconstructed target three-dimensional image as the target reference image.
Finally, the real-time image guidance system can determine the reconstructed target three-dimensional image as the target reference image.
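The reconstruction in step 3012 can be illustrated with a deliberately simplified 2-D example: parallel-beam, unfiltered backprojection of a single slice. Real CBCT reconstruction uses filtered (FDK-style) methods on 2-D cone-beam projections; this sketch only shows the idea of recovering a volume from multi-angle projections:

```python
import numpy as np
from scipy.ndimage import rotate

def project(slice_2d, angle_deg):
    # Parallel-beam line integrals of a 2-D slice at one shooting angle.
    return rotate(slice_2d, angle_deg, reshape=False, order=1).sum(axis=0)

def backproject(projections, angles_deg, size):
    """Unfiltered backprojection: smear each 1-D projection across the
    image, rotate it back to its shooting angle, and accumulate over
    all angles.  Peaks at the true object positions, but blurred
    compared with filtered reconstruction.
    """
    recon = np.zeros((size, size))
    for proj, angle in zip(projections, angles_deg):
        smear = np.tile(proj, (size, 1))
        recon += rotate(smear, -angle, reshape=False, order=1)
    return recon / len(angles_deg)
```

Projecting a point object over a range of shooting angles and backprojecting recovers an image whose maximum sits at the original point.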
In the first way, the positioning of the target object may be completed by registering two-dimensional images with two-dimensional images, or by registering three-dimensional images with three-dimensional images.
Taking positioning of the target object by two-dimensional-to-two-dimensional image registration as an example, the method may further include:
and A1, acquiring alternative two-dimensional projection images of the target object under at least two shooting angles by adopting an image guiding device.
Optionally, in the process of positioning the patient, the real-time image guidance system may use the image guidance device to shoot the target object under at least two shooting angles, so as to obtain an alternative two-dimensional projection image of the target object under at least two shooting angles. The acquisition method may refer to the description of step 3011, and will not be described in detail here.
Step A2, registering the candidate two-dimensional projection images at the at least two shooting angles with planning digitally reconstructed radiograph (DRR) images reconstructed from the planning image at the same at least two shooting angles.
Optionally, the planning image may be an image obtained by scanning the target object with a planning image acquisition device when the treatment plan is made before radiation treatment, and the real-time image guidance system may acquire the planning image sent by the planning image acquisition device. For example, the planning image acquisition device may transmit the acquired planning image to the real-time image guidance system after receiving an image acquisition instruction sent by the real-time image guidance system; alternatively, it may transmit the planning image upon receiving an image transmission instruction issued by the host.
Optionally, the planning image may be a CT image or a magnetic resonance (MR) image; that is, the planning image acquisition device may be a CT device or an MR device. However, since both CT and MR images are three-dimensional, in order to register with the two-dimensional projection images acquired in step A1, the real-time image guidance system may reconstruct DRR images at at least two shooting angles from the acquired planning image, where the at least two shooting angles are the same as those in step A1.
The real-time image guidance system may then register the candidate two-dimensional projection image acquired in step A1 at each shooting angle with the planning DRR image reconstructed from the planning image at the corresponding angle, to determine whether setup is complete. Optionally, taking one shooting angle as an example, the real-time image guidance system may take the planning DRR image at that angle as the reference image, take the candidate two-dimensional projection image at that angle as the image to be registered, compare the coordinates of corresponding points in the two images, and determine from the comparison result whether positioning is completed.
Step A3, the positioning of the target object is completed when the registration result satisfies the registration condition.
Optionally, the registration condition may be: the positional deviation of the target object between the two registered images is less than or equal to a deviation threshold. For example, if the deviation threshold is 0 and the registration condition is that the deviation equals the threshold, then a registration result satisfying the registration condition means that the position of the target object is identical in the two registered images.
In the application scenario described in the embodiments of the present disclosure, the deviation is generally a three-dimensional deviation; the above is only a schematic description.
If, after step A2 is performed, the registration result between the candidate two-dimensional projection image at each shooting angle and the planning DRR image reconstructed at the corresponding angle satisfies the above registration condition, the real-time image guidance system may determine that positioning is complete, and may further determine that the candidate two-dimensional projection images at the at least two shooting angles acquired in step A1 are the target two-dimensional projection images of the target object at different shooting angles used in step 3011. That is, after step A3 is performed, step 3011 may continue immediately.
Of course, if the registration result does not satisfy the registration condition, steps A1 to A3 may be repeated until the registration result satisfies the registration condition.
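The registration condition above reduces to a simple threshold check; the default threshold value and its units (e.g. millimetres) are illustrative assumptions, not from the patent:

```python
import numpy as np

def setup_complete(measured_offset, deviation_threshold=1.0):
    """Registration condition from the text: setup (positioning) is
    considered complete when the target's positional deviation between
    the two registered images is at most the deviation threshold.
    The threshold and units here are illustrative.
    """
    return float(np.linalg.norm(measured_offset)) <= deviation_threshold
```

In the loop described above, steps A1 to A3 (or B1 to B3) repeat until this check passes.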
In a second way, the candidate three-dimensional image reconstructed for registration when the target object is positioned is directly determined as the target reference image. To this end, before acquiring the target reference image of the target object, the method may further include:
and B1, acquiring an alternative three-dimensional image of the target object by adopting an image guiding device.
Wherein the candidate three-dimensional image may be an image reconstructed based on reference two-dimensional projection images of the target object at different photographing angles.
In other words, during setup the real-time image guidance system may first acquire reference two-dimensional projection images of the target object at different shooting angles using the image guidance device, and then reconstruct a candidate three-dimensional image of the target object from them.
Optionally, for the method of acquiring the reference two-dimensional projection images at different shooting angles using the image guidance device, refer to step 3011, which is not repeated here. The candidate three-dimensional image may be a CBCT image.
Step B2, registering the candidate three-dimensional image with the planning image.
Since the candidate three-dimensional image obtained in step B1 and the planning image are both three-dimensional, the real-time image guidance system can register them directly at this point. For the registration method, refer to the description of step A2, which is not repeated here.
Step B3, the positioning of the target object is completed when the registration result satisfies the registration condition.
As in step A3, when the registration result satisfies the registration condition, the real-time image guidance system may determine that positioning is complete and may continue with step B4 below. If the registration result does not satisfy the registration condition, steps B1 to B3 may be repeated until it does.
Step B4, determining the candidate three-dimensional image as the target reference image (step B4 is another specific implementation of step 301).
Finally, the real-time image guidance system can determine the candidate three-dimensional image used for registration as the target reference image. That is, through steps B1 to B4, the target reference image can be determined directly once positioning is completed. Note that in the second way, steps B1 to B3 complete the positioning of the target object by three-dimensional-to-three-dimensional image registration.
Step 302, acquiring a real-time projection image of a target object by adopting an image guiding device.
As described in the above embodiments, once the target reference image of the target object has been acquired, positioning is complete, and the patient can be moved into the treatment space of the radiotherapy apparatus for radiation treatment.
Although the patient enters the treatment space only after positioning is completed, the position of the target object may still shift during radiotherapy due to unavoidable factors such as patient movement, respiration, or coughing. To ensure treatment precision and avoid mistakenly irradiating other, normal tissue, the position of the target object must therefore be tracked in real time with image guidance during radiotherapy, so that the patient's position can be adjusted in real time and the focus of the therapeutic beam remains aligned with the treatment target of the target object. To track the position of the target object, the real-time image guidance system needs to acquire its current position.
For example, the real-time image guidance system may use the image guidance device to acquire a real-time projection image of the target object, i.e., a two-dimensional projection image of the target object at the target shooting angle. For the acquisition method, refer to the above embodiments. In addition, the real-time image guidance system may begin acquiring the real-time projection image using the image guidance device after receiving an imaging instruction sent by the host.
Step 303, obtaining a target shooting angle of the real-time projection image.
In order to facilitate the subsequent image registration, the real-time image guidance system may also acquire a shooting angle at which the image guidance device is located when acquiring the real-time projection image, i.e. a target shooting angle.
Step 304, acquiring a target digitally reconstructed radiograph (DRR) image of the target three-dimensional image under the target shooting angle.
In combination with the above steps: the image acquired in real time during treatment, which reflects the current position of the target object, is a two-dimensional image, whereas the image acquired after positioning is completed is a three-dimensional image. Therefore, before registration, the real-time image guidance system also needs to reconstruct a DRR image of the target object under the target shooting angle from the target three-dimensional image. The position of the target object in the DRR image is the reference position the target object should occupy when its treatment target point is aligned with the focal point of the therapeutic beam.
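As a simplified illustration of DRR generation (not the patented implementation), a DRR can be approximated in the parallel-beam case by summing the volume's attenuation values along the projection axis; the axis here stands in for the target shooting angle:

```python
import numpy as np

def simple_drr(volume, axis=0):
    # Parallel-beam approximation: integrate attenuation along one axis.
    # A clinical DRR instead casts diverging rays from the X-ray source
    # through the CT volume toward the detector plane.
    return volume.sum(axis=axis)

# Toy 3D volume containing a single bright voxel (e.g. a metal marker).
vol = np.zeros((4, 4, 4))
vol[2, 1, 3] = 5.0
drr = simple_drr(vol, axis=0)   # 2D projection "under the shooting angle"
```

A real system would trace rays through the CT volume according to the source-detector geometry of the image guidance device; the principle of collapsing the 3D volume into a 2D reference image is the same.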
It should be noted that each two-dimensional projection image under a given shooting angle can correspond to a DRR image at the same angle. That is, whatever the target shooting angle, the real-time image guidance system can generate a DRR image at that angle from the target three-dimensional image of the target object.
In the prior art, the DRR image is generated from the planning image acquired before positioning. Because the target three-dimensional image is acquired after positioning is completed, a DRR image generated from it represents the reference position of the target object more accurately, which improves the registration result.
Optionally, after the treatment plan is made, a marker may be placed on the patient's body surface or inside the patient's body, for example by attaching the marker to the body surface or implanting it into the body.
Alternatively, the marker may be a metal marker (abbreviated as "gold marker") made of a metal material. Because such a marker images clearly, it can improve accuracy during registration. In addition, at least three non-collinear markers may be provided, so that the real-time image guidance system can register images under different shooting angles with reference to the positions of the plurality of markers, further improving registration accuracy.
By way of example, assume that the target object is a tumor located in the head and three non-collinear markers are provided in total. Then referring to fig. 5, one marker may be placed at each of the patient's two temples and nose tip. If the target object is located in the body, a marker may be placed at the spine of the patient.
Because the marker is placed before positioning, it can be included both in the target reference image acquired after positioning is completed and in the real-time projection image acquired during treatment. To improve registration accuracy, step 304 may further include:
and performing image processing on the target three-dimensional image to obtain a target three-dimensional image only comprising the marker, and then acquiring a target DRR image of the target three-dimensional image only comprising the marker under a target shooting angle. I.e. the DRR image used for the subsequent registration may be a DRR image comprising only markers.
It should be noted that the real-time image guidance system may store at least two target reference images. Wherein a set of target reference images may include only markers, and the target reference images including only markers may be used for subsequent real-time image guidance. Another set of target reference images may include markers and other information (e.g., bone tissue) that may be used for display, such as may be used for display to a treating physician via a host computer.
Step 305, guiding the target object in real time according to the target DRR image and the real-time projection image.
Alternatively, as an alternative implementation, referring to the method flowchart shown in fig. 6, step 305 may include:
Step 3051A, image registration is performed on the target DRR image and the real-time projection image.
After the target DRR image and the real-time projection image are obtained, the real-time image guiding system can take the target DRR image as a reference image, take the real-time projection image as an image to be registered, and compare coordinates of points in the two images to determine the offset of the target object.
An alternative embodiment: if the target object includes a marker, step 3051A may include:
dividing the markers in the target DRR image and the real-time projection image respectively, and registering the markers in the divided target DRR image and the markers in the real-time projection image.
Alternatively, the segmentation method may be as follows. The real-time image guidance system first performs image blurring on the image to be segmented (such as the target DRR image or the real-time projection image) so that the marker in the image blends into the background, yielding a new image. The system then subtracts the blurred image from the image to be segmented to complete the segmentation of the marker. Here, image subtraction refers to subtracting the pixel value of each pixel in the blurred image from the pixel value of the corresponding pixel in the image to be segmented.
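The blur-and-subtract step can be sketched as follows; this is a minimal illustration using a naive mean filter, not the system's actual blurring kernel or threshold:

```python
import numpy as np

def box_blur(img, k=3):
    # Naive k x k mean filter; border pixels average their in-bounds window.
    out = np.empty(img.shape, dtype=float)
    r = k // 2
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = img[max(0, i - r):i + r + 1,
                            max(0, j - r):j + r + 1].mean()
    return out

def segment_markers(img, thresh):
    # Blurring makes a small, bright marker blend into the background;
    # subtracting the blurred copy leaves a peak at the marker position.
    return (img - box_blur(img)) > thresh

img = np.zeros((9, 9))
img[4, 4] = 10.0                        # one marker on a flat background
mask = segment_markers(img, thresh=5.0)  # boolean marker mask
```

The threshold value is an assumed example; in practice it would be tuned to the contrast of the gold markers against the projection background.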
Or, the real-time image guidance system may first acquire the CT values of each target object and of the marker in the image to be segmented. The CT value is expressed in Hounsfield units (HU) and measures the absorption of radiation by human tissue. A threshold for CT filtering (referred to simply as the reference threshold) can be preset in the real-time image guidance system. The system can then perform image normalization on the image to be segmented based on these CT values and the reference threshold, completing the marker segmentation. For example, the system may compare the CT value of each target object with the reference threshold: a CT value greater than the reference threshold is left unchanged; a CT value less than the reference threshold is set to a first threshold; and the CT value of the marker is set to a second threshold.
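The CT-value filtering alternative can be sketched as below. All numeric values (the 200 HU reference threshold, the first threshold of 0, the second threshold of 3000) and the marker mask are illustrative assumptions, not values prescribed by the method:

```python
import numpy as np

def normalize_ct(ct, reference_hu, low_value, marker_mask, marker_value):
    # Voxels at or above the reference threshold keep their HU value;
    # voxels below it are clamped to the "first threshold" (low_value);
    # known marker voxels are forced to the "second threshold".
    out = ct.astype(float).copy()
    out[ct < reference_hu] = low_value
    out[marker_mask] = marker_value
    return out

ct = np.array([[-100.0, 400.0],
               [1200.0,  30.0]])
marker = np.array([[False, False],
                   [True,  False]])      # assumed metal-marker voxel
norm = normalize_ct(ct, reference_hu=200.0, low_value=0.0,
                    marker_mask=marker, marker_value=3000.0)
```

After this normalization the marker stands out at a known, distinctive value, which simplifies the subsequent segmentation and registration.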
Or acquiring a target shooting angle of the image to be segmented, acquiring a reconstructed three-dimensional image, and acquiring a target DRR image of the reconstructed three-dimensional image under the target shooting angle. Optionally, the reconstructed three-dimensional image may be filtered to obtain a three-dimensional image including only the marker, and a target DRR image of the three-dimensional image including only the marker under the target shooting angle is obtained. Then, one or more reference regions of interest (region of interest, ROI) are constructed in the target DRR image, the reference ROI comprises one or more markers, and the one or more reference ROIs are mapped in the image to be segmented, so that one or more reference target ROIs are correspondingly obtained, and segmentation is completed.
Optionally, the image registration of the markers in the segmented target DRR image and the markers in the real-time projection image may be: the target point of the contrast marker is located in the target DRR image and the real-time projection image. The target point may be the center point of the marker.
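Comparing the marker target points (centre points) in the two images can be illustrated with a centroid comparison of the segmented marker masks; this is a sketch of the idea, not the patented registration algorithm:

```python
import numpy as np

def marker_offset(ref_mask, live_mask):
    # Target point = centre (centroid) of the marker mask; the offset is
    # the displacement of the live centroid relative to the DRR centroid.
    ref_c = np.argwhere(ref_mask).mean(axis=0)
    live_c = np.argwhere(live_mask).mean(axis=0)
    return live_c - ref_c

ref = np.zeros((8, 8), dtype=bool);  ref[2, 2] = True   # marker in DRR
live = np.zeros((8, 8), dtype=bool); live[4, 5] = True  # marker in projection
offset = marker_offset(ref, live)   # marker shifted by (+2 rows, +3 cols)
```

With at least three non-collinear markers, the per-marker displacements would be combined to recover a rigid transformation rather than a single translation.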
Another alternative embodiment: if the target object includes bony landmark tissue and markers, step 3051A may include: dividing the markers in the target DRR image and the real-time projection image respectively, and carrying out first target registration on the markers in the divided target DRR image and the markers in the real-time projection image; and carrying out second target registration on the bony landmark tissue in the target DRR image and the bony landmark tissue in the real-time projection image. The comparison process may refer to the above embodiments and is not repeated here.
Step 3052A, determining a target offset of the target object according to the registration result.
Alternatively, if image registration is performed based on only the markers, the real-time image guidance system may directly determine the offset obtained by registration as the target offset of the target object.
Optionally, if image registration is performed based on both the markers and the bony landmark tissue, the real-time image guidance system may determine a first offset of the target object from the registration result of the first target registration, determine a second offset from the registration result of the second target registration, and calculate the target offset of the target object based on the first offset, the weight value of the first offset, the second offset, and the weight value of the second offset.
Alternatively, the real-time image guidance system may be preset with the weight value of the first offset and the weight value of the second offset. After the first offset and the second offset are obtained, the system can calculate the actual offset of the target object by weighted summation, taking the first offset, its weight value, the second offset, and its weight value as parameters. Determining the offset of the target object by combining the two modes yields good reliability and precision.
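The weighted summation can be sketched as below; the weight values 0.7 and 0.3 are illustrative assumptions, since the patent only states that the weights are preset:

```python
def fuse_offsets(marker_off, bone_off, w_marker=0.7, w_bone=0.3):
    # Weighted summation of the marker-based and bone-based offsets,
    # component by component; weight values here are assumed examples.
    return [w_marker * m + w_bone * b for m, b in zip(marker_off, bone_off)]

# Hypothetical per-axis offsets (e.g. in millimetres) from the two
# registrations, combined into the target offset.
target_offset = fuse_offsets([2.0, 3.0, 0.0], [1.0, 2.0, 1.0])
```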
It should be noted that, if the real-time image guidance system cannot obtain an offset by registration from the image at a certain shooting angle, it may choose not to output a result, so as to avoid the risk caused by an erroneous output.
Step 3053A, adjusting the position of the target object according to the target offset.
Finally, the real-time image guiding system can flexibly adjust the position of the target object according to the target offset, so that the treatment precision is ensured. For example, after the target offset is obtained through registration, the real-time image guidance system can send the target offset to the host, so that the host can realize real-time flexible adjustment of the position of the patient, and the treatment precision is ensured.
Optionally, when the target offset is small, its influence on the accuracy of radiotherapy may be negligible. Therefore, after obtaining the target offset, the real-time image guidance system may detect whether it is greater than an offset threshold, that is, whether the patient's movement deviation is significant. If the target offset is greater than the offset threshold, the real-time image guidance system may output it to the host for position adjustment. Alternatively, the host may detect whether the target offset is greater than the offset threshold and, when it is, reposition the target object. The embodiments of the present disclosure are not limited in this regard.
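The threshold check can be sketched as follows; the 2.0 mm tolerance is an assumed example value, not one specified by the method:

```python
def should_reposition(offset, threshold=2.0):
    # Report the offset for position adjustment only when its magnitude
    # exceeds the tolerance; small drift is ignored as clinically
    # insignificant. The threshold value is an assumed example.
    magnitude = sum(c * c for c in offset) ** 0.5
    return magnitude > threshold

small = should_reposition([0.5, 0.5, 0.5])  # minor drift, ignored
large = should_reposition([3.0, 0.0, 0.0])  # exceeds tolerance
```

Whether this gate runs in the guidance system or in the host is an implementation choice, matching the two alternatives described above.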
Alternatively, as another alternative implementation, referring to the method flowchart shown in fig. 7, step 305 may include:
step 3051B, determining the image scaling factor according to the target DRR image and the real-time projection image.
Alternatively, the real-time image guidance system may determine the image scaling factor of the real-time projection image by comparing the target DRR image with the real-time projection image. In this way, only a real-time projection image under one shooting angle needs to be acquired.
For example, taking one shooting angle as an example: if the target object moves downward under the shooting angle, the current real-time projection image under that angle appears reduced compared with the image before the movement; if the target object moves upward, the current real-time projection image appears magnified compared with the image before the movement. In this way, the image scaling factor can be determined.
Optionally, if the target object further includes a marker, the step 3051B may include: dividing the markers in the target DRR image and the real-time projection image respectively, and determining the image scaling factor according to the markers in the divided target DRR image and the markers in the real-time projection image.
Step 3052B, determining a target offset of the target object according to the image scaling factor.
After determining the image scaling factor, the real-time image guidance system can determine the offset of the target object along the scaling direction based on the image scaling factor. This offset can then be combined with the other two offsets determined under the two-dimensional coordinate system to obtain the overall target offset of the target object.
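The relation between the measured scaling factor and the shift along the beam axis can be sketched with a simple diverging-beam model; the 1000 mm source-to-object distance is an assumed example value:

```python
def depth_shift_from_scale(scale, source_to_object=1000.0):
    # In a diverging-beam geometry the projected size is inversely
    # proportional to the source-to-object distance, so
    #   scale = d_before / d_after  =>  d_after = d_before / scale.
    # A negative result means the object moved toward the source.
    return source_to_object / scale - source_to_object

shift = depth_shift_from_scale(1.25)  # image magnified by 25 %
```

Here a 25 % magnification implies the object moved 200 mm closer to the source under the assumed geometry; a real system would use the calibrated source-to-axis and source-to-detector distances of the image guidance device.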
Step 3053B, adjusting the position of the target object according to the target offset.
This step can be referred to above in step 3053A, and will not be described in detail here.
As another alternative implementation, the target reference image is a two-dimensional image. Fig. 8 is a flowchart of another real-time image guidance method provided by an embodiment of the disclosure, as shown in fig. 8, the method may include:
In step 401, when the positioning of the target object is completed, the target two-dimensional projection images of the target object under different shooting angles are determined as the target reference image.
The target two-dimensional projection images of the target object under different shooting angles can be acquired by the image guidance device. That is, the real-time image guidance system can acquire target two-dimensional projection images under different shooting angles by means of the image guidance device.
Step 402, acquiring a real-time projection image of a target object by adopting an image guiding device.
This step may be described with reference to step 302 above, and will not be described in detail herein.
Step 403, obtaining a target shooting angle of the real-time projection image.
This step may be described with reference to step 303 above, and will not be described in detail herein.
Step 404, acquiring a target two-dimensional projection image of the target object under the target shooting angle.
After determining the target shooting angle of the obtained real-time projection image, the real-time image guiding system can further determine the target two-dimensional projection image under the target shooting angle in the obtained target two-dimensional projection images under different shooting angles.
Step 405, determining the image scaling factor according to the target two-dimensional projection image and the real-time projection image under the target shooting angle.
This step may be described with reference to step 3051B above, and will not be described again here.
And similarly, if the target object includes a marker, then, correspondingly, step 405 may include:
dividing the target two-dimensional projection image and the marker in the real-time projection image under the target shooting angle respectively, and determining the image scaling factor according to the divided marker in the target two-dimensional projection image and the marker in the real-time projection image under the target shooting angle.
Step 406, determining a target offset of the target object according to the image scaling factor.
This step may be described with reference to step 3052B above, and will not be described in detail herein.
Step 407, adjusting the position of the target object according to the target offset.
This step may be described with reference to step 3053B above, and will not be described again here.
Optionally, under the condition that positioning for the target object is completed, the real-time image guiding system can also acquire the positioning registration type of the target object in the positioning stage, and flexibly determine the target reference image based on the positioning registration type. I.e. a further real-time image guidance method as shown in fig. 9. As shown in fig. 9, the method may include:
In step 501, when the positioning of the target object is completed, the positioning registration type of the target object is obtained.
Alternatively, in the embodiments of the present disclosure, the real-time image guidance system may first obtain the positioning registration type used when the target object was positioned. The positioning registration type is the type of image registration used in the positioning stage. Based on the above embodiments, the positioning registration type may be 2D-2D registration or 3D-3D registration.
Step 502, determining a target reference image based on the image acquired by the image guiding device according to the positioning registration type.
If the positioning registration type is 2D-2D registration, in combination with the description of the above embodiments, the real-time image guidance system may first acquire target two-dimensional projection images of the target object under different shooting angles by using the image guidance device, then reconstruct these images, and determine the reconstructed target three-dimensional image as the target reference image.
If the positioning registration type is 3D-3D registration, in combination with the description of the above embodiments, the real-time image guidance system may directly determine the target three-dimensional image registered with the planning image when positioning was completed as the target reference image.
Step 503, acquiring a real-time projection image of the target object by using an image guiding device.
This step may be described with reference to step 302 above, and will not be described in detail herein.
Step 504, performing real-time image guidance on the target object according to the target reference image and the real-time projection image.
Alternatively, if the target object includes a marker, step 504 may include:
dividing the markers in the target reference image and the real-time projection image respectively, carrying out first target registration on the markers in the divided target reference image and the markers in the real-time projection image, and determining the target offset of the target object according to the registration result of the first target registration. Further, the position of the target object can be adjusted according to the target offset to realize real-time image guidance of the target object.
Alternatively, if the target object includes bony landmark tissue and markers, step 504 may include:
dividing the markers in the target reference image and the real-time projection image respectively, and carrying out first target registration on the markers in the divided target reference image and the markers in the real-time projection image, so as to determine a first offset of the target object according to the registration result of the first target registration. Then, second target registration is carried out on the bony landmark tissue in the target reference image and the bony landmark tissue in the real-time projection image, and a second offset of the target object is determined according to the registration result of the second target registration. Finally, the target offset of the target object is calculated based on the first offset, the weight value of the first offset, the second offset, and the weight value of the second offset. Further, the position of the target object can be adjusted according to the target offset to realize real-time image guidance of the target object.
Alternative implementation manners of the above steps may be described with reference to the above corresponding embodiments, which are not described herein.
It should be noted that the sequence of the steps of the real-time image guidance method provided in the embodiments of the present disclosure may be appropriately adjusted, and any variation readily conceivable by a person skilled in the art within the technical scope disclosed in the present disclosure shall be covered by the protection scope of the present disclosure; details are therefore not repeated.
In summary, the embodiments of the present disclosure provide a real-time image guidance method. The image guidance system can reliably guide the real-time image of the target object based on the acquired target reference image and the real-time projection image of the target object. Because the target reference image is acquired after positioning is completed, the accuracy of guiding the real-time image of the target object based on the target reference image and the real-time projection image is higher.
Fig. 10 is a block diagram of a real-time image guidance apparatus according to an embodiment of the present disclosure, which may be applied to the real-time image guidance system 03 shown in fig. 1. As shown in fig. 10, the apparatus may include:
a first acquisition module 601, configured to acquire a target reference image of a target object.
The target reference image may be an image determined based on an image acquired by the image guiding device under the condition that positioning of the target object is completed.
A second acquisition module 602, configured to acquire a real-time projection image of the target object using the image guidance apparatus.
The image guiding module 603 is configured to guide the real-time image of the target object according to the target reference image and the real-time projection image.
Alternatively, as an alternative implementation: the target reference image is a three-dimensional image. Then, the first acquisition module 601 may be configured to:
and under the condition that the positioning of the target object is completed, determining a target three-dimensional image reconstructed by the target two-dimensional projection images of the target object under different shooting angles as a target reference image. The two-dimensional projection images of the target object under different shooting angles are acquired by the image guiding device.
That is, the first acquisition module 601 may be configured to:
and under the condition that the positioning of the target object is completed, acquiring target two-dimensional projection images of the target object under different shooting angles by adopting an image guiding device.
Reconstructing target two-dimensional projection images of the target object under different shooting angles.
And determining the reconstructed target three-dimensional image as a target reference image.
Optionally, as shown in fig. 11, the apparatus may further include:
the fourth obtaining module 604 is configured to obtain, by using the image guidance device, an alternative three-dimensional image of the target object, where the alternative three-dimensional image is an image reconstructed based on a reference two-dimensional projection image of the target object at different shooting angles, before obtaining the target reference image of the target object.
A second image registration module 605 is used for image registration of the alternative three-dimensional image and the planning image.
A second determining module 606 is configured to determine that positioning of the target object is completed when the registration result meets the registration condition.
Accordingly, the first acquisition module 601 may be configured to determine the alternative three-dimensional image as the target reference image.
Alternatively, as an alternative implementation: the image guidance module 603 may be configured to:
and acquiring a target shooting angle of the real-time projection image.
And acquiring a target digitally reconstructed radiograph (DRR) image of the target three-dimensional image under the target shooting angle.
And carrying out real-time image guidance on the target object according to the target DRR image and the real-time projection image.
For example, the image guidance module 603 may be configured to:
and carrying out image registration on the target DRR image and the real-time projection image, determining a target offset of the target object according to a registration result, and adjusting the position of the target object according to the target offset.
Alternatively, the target object may include a marker, and accordingly, the image guidance module 603 may be configured to:
dividing the markers in the target DRR image and the real-time projection image respectively, and registering the markers in the divided target DRR image and the markers in the real-time projection image.
Optionally, the image guidance module 603 may be configured to: and performing image processing on the target three-dimensional image to obtain a target three-dimensional image only comprising the marker, and acquiring a target DRR image of the target three-dimensional image only comprising the marker under a target shooting angle.
Alternatively, as another alternative implementation: the image guidance module 603 may be configured to:
and determining the image scaling factor according to the target DRR image and the real-time projection image.
And determining the target offset of the target object according to the image scaling factor.
And adjusting the position of the target object according to the target offset.
Alternatively, if the target object includes a marker, the image guidance module 603 may be configured to:
dividing the markers in the target DRR image and the real-time projection image respectively, and determining the image scaling factor according to the markers in the divided target DRR image and the markers in the real-time projection image.
Alternatively, as another alternative implementation: if the target reference image is a two-dimensional image. Then, the first acquisition module 601 may be configured to:
and under the condition that the positioning of the target object is completed, determining a target two-dimensional projection image of the target object under different shooting angles as a target reference image. The two-dimensional projection images of the target object under different shooting angles are acquired by the image guiding device.
Optionally, the image guidance module 603 may be configured to: and acquiring a target shooting angle of the real-time projection image. And acquiring a target two-dimensional projection image of the target object under the target shooting angle. And determining the image scaling factor according to the target two-dimensional projection image and the real-time projection image under the target shooting angle. And determining the target offset of the target object according to the image scaling factor. And adjusting the position of the target object according to the target offset.
Alternatively, if the target object includes a marker, the image guidance module 603 may be configured to:
dividing the target two-dimensional projection image and the marker in the real-time projection image under the target shooting angle respectively, and determining the image scaling factor according to the divided marker in the target two-dimensional projection image and the marker in the real-time projection image under the target shooting angle.
Alternatively, the markers may comprise metallic markers, and the metallic markers may be affixed to the patient's body surface or implanted into the patient during the positioning phase.
Alternatively, the target object may include: at least three non-collinear markers.
Alternatively, as yet another alternative implementation: the first acquisition module 601 may be configured to:
And under the condition that the positioning of the target object is completed, acquiring a positioning registration type of the target object, the positioning registration type being the type of image registration used in the positioning stage, and determining the target reference image based on the image acquired by the image guidance device according to the positioning registration type.
Alternatively, the registration types may include 2D-2D registration and 3D-3D registration.
Wherein, if the positioning registration type includes 2D-2D registration, the first obtaining module 601 may be configured to: and acquiring target two-dimensional projection images of the target object under different shooting angles by adopting an image guiding device, reconstructing the target two-dimensional projection images of the target object under different shooting angles, and determining the reconstructed target three-dimensional image as a target reference image.
If the localization registration type includes 3D-3D registration, the first acquisition module 601 may be configured to: and determining the target three-dimensional image registered with the planning image when the positioning is completed as a target reference image.
In summary, the embodiments of the present disclosure provide a real-time image guidance device. The device can conduct reliable real-time image guidance on the target object based on the acquired target reference image and the real-time projection image of the target object. Because the target reference image is acquired after positioning is completed, the accuracy of guiding the real-time image of the target object based on the target reference image and the real-time projection image is higher.
With respect to the real-time image guidance apparatus in the above-described embodiment, the specific manner in which the respective modules perform the operations has been described in detail in the embodiment regarding the method, and will not be described in detail herein.
Alternatively, in conjunction with fig. 1 and 13, the real-time image guidance system 03 in a radiotherapy system may include: an image guidance device 031, a processor 032, and a memory 033. The image guidance device may be used to acquire images, and the memory may store instructions that are loaded and executed by the processor to implement the real-time image guidance method shown in any one of fig. 3, 8, and 9.
Optionally, the embodiments of the present disclosure further provide a storage medium in which instructions may be stored; when the instructions are executed on a processing component, the processing component is caused to perform the real-time image guidance method shown in any one of fig. 3, 8, and 9.
The foregoing description covers only preferred embodiments of the present disclosure and is not intended to limit the disclosure; any modifications, equivalents, improvements, and the like made within the spirit and principles of the embodiments of the present disclosure shall fall within the protection scope of the present disclosure.

Claims (23)

  1. A real-time image guidance method, the method comprising:
    acquiring a target reference image of a target object, wherein the target reference image is an image determined, based on an image acquired by an image guidance device, after positioning of the target object is completed;
    acquiring a real-time projection image of the target object by using the image guidance device; and
    performing real-time image guidance on the target object according to the target reference image and the real-time projection image.
  2. The method of claim 1, wherein the acquiring the target reference image of the target object comprises:
    after positioning of the target object is completed, determining a target three-dimensional image, reconstructed from target two-dimensional projection images of the target object at different shooting angles, as the target reference image;
    wherein the target two-dimensional projection images of the target object at different shooting angles are acquired by the image guidance device.
  3. The method of claim 2, wherein the determining the target three-dimensional image reconstructed from the target two-dimensional projection images of the target object at different shooting angles as the target reference image after positioning of the target object is completed comprises:
    after positioning of the target object is completed, acquiring, by using the image guidance device, target two-dimensional projection images of the target object at different shooting angles;
    reconstructing a target three-dimensional image from the target two-dimensional projection images of the target object at the different shooting angles; and
    determining the reconstructed target three-dimensional image as the target reference image.
  4. The method of claim 2, wherein before the acquiring the target reference image of the target object, the method further comprises:
    acquiring an alternative three-dimensional image of the target object by using the image guidance device, wherein the alternative three-dimensional image is reconstructed based on reference two-dimensional projection images of the target object at different shooting angles;
    performing image registration on the alternative three-dimensional image and a planning image; and
    when the registration result is determined to meet a registration condition, completing positioning of the target object;
    correspondingly, the determining the target three-dimensional image reconstructed from the target two-dimensional projection images of the target object at different shooting angles as the target reference image after positioning of the target object is completed comprises:
    determining the alternative three-dimensional image as the target reference image.
  5. The method of claim 2, wherein the performing real-time image guidance on the target object according to the target reference image and the real-time projection image comprises:
    acquiring a target shooting angle of the real-time projection image;
    acquiring a target digitally reconstructed radiograph (DRR) image of the target three-dimensional image at the target shooting angle; and
    performing real-time image guidance on the target object according to the target DRR image and the real-time projection image.
  6. The method of claim 5, wherein the performing real-time image guidance on the target object according to the target DRR image and the real-time projection image comprises:
    performing image registration on the target DRR image and the real-time projection image;
    determining a target offset of the target object according to the registration result; and
    adjusting the position of the target object according to the target offset.
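As an illustration of the register-then-offset flow in claim 6, here is a minimal sketch, not the patented implementation: it assumes the markers have already been segmented from both images and matched in order, and estimates a pure in-plane translation from the marker centroids. All coordinates, the pixel spacing, and the function names are hypothetical.

```python
import numpy as np

def estimate_target_offset(drr_markers, live_markers):
    """Estimate the in-plane offset of the target object from matched
    marker positions (pixel coordinates) in the reference DRR image
    and the real-time projection image."""
    drr_markers = np.asarray(drr_markers, dtype=float)
    live_markers = np.asarray(live_markers, dtype=float)
    # Pure translation model: difference of the marker centroids.
    return live_markers.mean(axis=0) - drr_markers.mean(axis=0)

def couch_correction(offset_px, pixel_spacing_mm):
    """Convert the pixel offset to a couch correction in mm, with the
    opposite sign so the patient is moved back toward the reference."""
    return -np.asarray(offset_px) * pixel_spacing_mm

# Hypothetical marker positions: the live image is shifted by (+3, -2) px.
offset = estimate_target_offset(
    [(100, 120), (140, 80), (90, 60)],
    [(103, 118), (143, 78), (93, 58)])
correction = couch_correction(offset, pixel_spacing_mm=0.5)
```

A production system would of course use full 2D/3D registration rather than a centroid difference; the sketch only shows why at least three non-collinear markers (claim 15) make the translation estimate well conditioned.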
  7. The method of claim 6, wherein the target object comprises markers, and the performing image registration on the target DRR image and the real-time projection image comprises:
    segmenting the markers in the target DRR image and in the real-time projection image, respectively; and
    performing image registration on the segmented markers in the target DRR image and the segmented markers in the real-time projection image.
  8. The method of claim 7, wherein the acquiring the target digitally reconstructed radiograph (DRR) image of the target three-dimensional image at the target shooting angle comprises:
    performing image processing on the target three-dimensional image to obtain a target three-dimensional image including only the markers; and
    acquiring the target DRR image of the marker-only target three-dimensional image at the target shooting angle.
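The marker-only DRR in claim 8 can be illustrated with a toy parallel-beam line-integral sketch. This is an illustrative assumption, not the disclosed method: a real image guidance device would ray-cast a cone beam at arbitrary gantry angles, whereas this sketch handles only multiples of 90 degrees, and the volume and marker positions are invented.

```python
import numpy as np

def marker_only_drr(volume, angle_deg):
    """Toy DRR: parallel-ray line integral through the volume after an
    in-plane rotation. Only multiples of 90 degrees are supported here."""
    k = int(angle_deg // 90) % 4
    rotated = np.rot90(volume, k=k, axes=(0, 1))  # rotate in the axial plane
    return rotated.sum(axis=0)                    # integrate along the beam axis

# A marker-only target three-dimensional image: zero everywhere except
# three (hypothetical) implanted markers.
vol = np.zeros((8, 8, 8))
vol[2, 3, 4] = vol[5, 1, 6] = vol[6, 6, 2] = 1.0
drr_0 = marker_only_drr(vol, 0)
drr_90 = marker_only_drr(vol, 90)
```

Because everything except the markers has been masked out, each DRR contains only three bright spots, which is what makes the marker segmentation and registration in claim 7 robust.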
  9. The method of claim 5, wherein the performing real-time image guidance on the target object according to the target DRR image and the real-time projection image comprises:
    determining an image scaling factor according to the target DRR image and the real-time projection image;
    determining a target offset of the target object according to the image scaling factor; and
    adjusting the position of the target object according to the target offset.
  10. The method of claim 9, wherein the target object comprises markers, and the determining the image scaling factor according to the target DRR image and the real-time projection image comprises:
    segmenting the markers in the target DRR image and in the real-time projection image, respectively; and
    determining the image scaling factor according to the segmented markers in the target DRR image and the segmented markers in the real-time projection image.
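To illustrate how an image scaling factor can be turned into a depth offset (claims 9 and 10), here is a hedged sketch under a simple point-source magnification model; the marker coordinates, the source-to-axis distance, and the model itself are assumptions for illustration, not the disclosed method.

```python
import numpy as np

def image_scaling_factor(drr_markers, live_markers):
    """Ratio of the mean marker spread (distance to the marker centroid)
    in the real-time projection image to that in the reference DRR image."""
    def mean_spread(points):
        pts = np.asarray(points, dtype=float)
        return np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean()
    return mean_spread(live_markers) / mean_spread(drr_markers)

def depth_offset_mm(scale, source_to_axis_mm):
    """Point-source model: magnification ratio m = SAD / (SAD - dz),
    hence dz = SAD * (1 - 1 / m); positive dz means the target has
    moved toward the radiation source."""
    return source_to_axis_mm * (1.0 - 1.0 / scale)

# Hypothetical markers: the live projection is magnified by exactly 1.1.
scale = image_scaling_factor(
    [(0, 0), (10, 0), (0, 10)],
    [(0, 0), (11, 0), (0, 11)])
dz = depth_offset_mm(scale, source_to_axis_mm=1000.0)
```

The point of the scaling factor is that a single projection cannot measure motion along the beam axis directly, but magnification changes reveal it.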
  11. The method of claim 1, wherein the acquiring the target reference image of the target object comprises:
    after positioning of the target object is completed, determining target two-dimensional projection images of the target object at different shooting angles as the target reference image;
    wherein the target two-dimensional projection images of the target object at different shooting angles are acquired by the image guidance device.
  12. The method of claim 11, wherein the performing real-time image guidance on the target object according to the target reference image and the real-time projection image comprises:
    acquiring a target shooting angle of the real-time projection image;
    acquiring the target two-dimensional projection image of the target object at the target shooting angle;
    determining an image scaling factor according to the target two-dimensional projection image at the target shooting angle and the real-time projection image;
    determining a target offset of the target object according to the image scaling factor; and
    adjusting the position of the target object according to the target offset.
  13. The method of claim 12, wherein the target object comprises markers, and the determining the image scaling factor according to the target two-dimensional projection image at the target shooting angle and the real-time projection image comprises:
    segmenting the markers in the target two-dimensional projection image at the target shooting angle and in the real-time projection image, respectively; and
    determining the image scaling factor according to the segmented markers in the target two-dimensional projection image at the target shooting angle and the segmented markers in the real-time projection image.
  14. The method of claim 7 or 10, wherein the markers comprise metallic markers affixed to a body surface of a patient or implanted in the patient during a positioning stage.
  15. The method of claim 14, wherein the target object comprises at least three non-collinear markers.
  16. The method of claim 1, wherein the acquiring the target reference image of the target object comprises:
    after positioning of the target object is completed, acquiring a positioning registration type of the target object, wherein the positioning registration type is the image registration type used in the positioning stage; and
    determining the target reference image, based on the image acquired by the image guidance device, according to the positioning registration type.
  17. The method of claim 16, wherein the positioning registration type comprises 2D-2D registration and 3D-3D registration.
  18. The method of claim 17, wherein if the positioning registration type comprises the 2D-2D registration, the acquiring the target reference image of the target object comprises:
    acquiring, by using the image guidance device, target two-dimensional projection images of the target object at different shooting angles;
    reconstructing a target three-dimensional image from the target two-dimensional projection images of the target object at the different shooting angles; and
    determining the reconstructed target three-dimensional image as the target reference image.
  19. The method of claim 17, wherein if the positioning registration type comprises the 3D-3D registration, the acquiring the target reference image of the target object comprises:
    determining the target three-dimensional image registered with the planning image when positioning is completed as the target reference image.
  20. A real-time image guidance system, comprising: an image guidance device, a processor, and a memory;
    wherein the image guidance device is configured to acquire images, and the memory stores instructions that are loaded and executed by the processor to implement the real-time image guidance method of any one of claims 1 to 19.
  21. The system of claim 20, wherein the image guidance device is a cone-beam computed tomography (CBCT) device.
  22. A storage medium having instructions stored therein that, when executed on a processing component, cause the processing component to perform the real-time image guidance method of any one of claims 1 to 19.
  23. A radiotherapy system, comprising: a patient support device, a host, and a real-time image guidance system, wherein the real-time image guidance system is the system of claim 20;
    the host is connected to the real-time image guidance system and to the patient support device, respectively; the real-time image guidance system is configured to send the determined target offset of the target object to the host, and the host is configured to adjust the position of the patient support device based on the target offset.
CN202080107396.3A 2020-12-10 2020-12-10 Real-time image guiding method, device and system and radiotherapy system Pending CN116490897A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/135261 WO2022120716A1 (en) 2020-12-10 2020-12-10 Real-time image guided method, apparatus and system, and radiotherapy system

Publications (1)

Publication Number Publication Date
CN116490897A (en) 2023-07-25

Family

ID=81972997

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080107396.3A Pending CN116490897A (en) 2020-12-10 2020-12-10 Real-time image guiding method, device and system and radiotherapy system

Country Status (2)

Country Link
CN (1) CN116490897A (en)
WO (1) WO2022120716A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9314160B2 (en) * 2011-12-01 2016-04-19 Varian Medical Systems, Inc. Systems and methods for real-time target validation for image-guided radiation therapy
CN108635681B (en) * 2018-03-21 2020-11-10 西安大医集团股份有限公司 Positioning method and device, upper computer and radiotherapy system
WO2020006681A1 (en) * 2018-07-03 2020-01-09 西安大医集团有限公司 Mark data obtaining method and apparatus, training method and apparatus, and medical device
WO2020014934A1 (en) * 2018-07-19 2020-01-23 西安大医集团有限公司 Tumor positioning method and device
WO2020087257A1 (en) * 2018-10-30 2020-05-07 西安大医集团有限公司 Image guidance method and device, and medical equipment and computer readable storage medium
CN110227214B (en) * 2019-07-12 2021-11-30 江苏瑞尔医疗科技有限公司 Radiotherapy positioning method based on positioning target

Also Published As

Publication number Publication date
WO2022120716A1 (en) 2022-06-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination