JP4827981B2 - Method and system for positioning an object from a first posture to a second posture - Google Patents

Method and system for positioning an object from a first posture to a second posture

Info

Publication number
JP4827981B2
Authority
JP
Japan
Prior art keywords
image
posture
display device
object
change
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2010121174A
Other languages
Japanese (ja)
Other versions
JP2011011051A (en)
JP2011011051A5 (en)
Inventor
Andrea E. G. Bradshaw
John C. Barnwell III
Yuri A. Ivanov
Original Assignee
Mitsubishi Electric Research Laboratories, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US12/495,744 (US7934869B2)
Application filed by Mitsubishi Electric Research Laboratories, Inc.
Publication of JP2011011051A
Publication of JP2011011051A5
Application granted
Publication of JP4827981B2
Application status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N: ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00: Radiation therapy
    • A61N5/10: X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048: Monitoring, verifying, controlling systems and methods
    • A61N5/1049: Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N2005/1061: Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source
    • A61N5/1064: Monitoring, verifying, controlling systems and methods for adjusting radiation treatment in response to monitoring
    • A61N5/1069: Target adjustment, e.g. moving the patient support

Description

  The present invention relates generally to a method for positioning an object, and more particularly to a method for positioning an object based on a registered image of the object.

Radiation Therapy
Radiation therapy directs high-energy ionizing radiation at diseased tissue, such as a tumor, in a patient's body while sparing healthy tissue. One form of radiation therapy is particle beam therapy, in which the maximum irradiation depth can be controlled. This makes it necessary to determine the location of the tumor accurately, particularly in the vicinity of critical organs such as the brain, liver, lungs, stomach, and heart. It is therefore desirable to position the patient's entire tissue structure according to the treatment plan.

  Radiation therapy directs high-energy ionizing radiation at diseased tissue in the body without harming healthy tissue. One form of radiation therapy is particle beam therapy, in which the maximum irradiation depth can be controlled. Unlike conventional photon-based radiation therapy, the dose peak lies inside the tissue, and the precise depth of the peak dose is determined by the energy of the particle beam and by the tissue in the beam's path. It is therefore necessary to determine the location of the target tissue accurately, and desirable to position the patient's entire tissue structure precisely according to the geometric alignment of the treatment beam specified in the treatment plan.

  Radiation therapy uses ionizing radiation as part of cancer treatment to control malignant cells. It can be used either curatively or palliatively: when a cure is not possible and the aim is to control local disease or relieve symptoms, it serves as palliative treatment; when the therapy offers a survival benefit and a cure is possible, it serves as curative treatment. Radiation therapy is used to treat malignant tumors and can be applied as the primary therapy. It is also common to combine radiation therapy with surgery, chemotherapy, hormonal therapy, or a combination thereof.

  In oncological cases, radiation therapy is generally applied primarily to the tumor. If nearby lymph nodes are clinically involved with the tumor or considered at risk of metastasis, the treatment field may also include those lymph nodes. A margin of normal tissue around the tumor must be included to account for uncertainty in patient placement and internal tumor motion.

  Radiation therapy is usually delivered in multiple short sessions over several weeks, e.g., three or four weeks, allowing the patient to recover between treatments. This makes it difficult to reproduce the same patient placement at every session. Therefore, to indicate to the radiation therapist how to position the patient relative to the beam, the patient's skin is typically marked with indelible ink during treatment planning.

  Placement uncertainty can also be caused by internal motion, such as breathing and bladder filling, and by movement of external skin marks relative to the tumor location.

  To avoid harming normal tissue, such as the skin or organs that the radiation must pass through to reach the tumor, shaped radiation beams are aimed so that they intersect at the tumor from several irradiation angles, delivering a much larger absorbed dose to the tumor than to the surrounding healthy tissue. Typically, the radiation source is mounted on a gantry that rotates around the patient. The goal is to place the tumor at the isocenter of the beam's central axis of rotation, so that the beam always passes through the tumor while passing through any given region of healthy tissue far less often.

Patient Positioning
A common problem in radiation therapy is positioning the patient relative to the radiation equipment according to the treatment plan. Treatment plans are typically created from high-resolution computed tomography (CT) scans and include three-dimensional (3D) volume data representing tissue density. During treatment, the patient must be positioned relative to the radiology device so that the tumor lies at the isocenter of the central axis of the radiation beam and the planned radiation dose is delivered to the tumor.

  To achieve this goal, typically a set of x-rays is acquired and compared to a predicted view of the CT volume. Positioning errors are estimated and the patient is moved to the correct position. Currently, this positioning is performed manually or semi-automatically.

  Manual methods are time consuming and require understanding the 3D shape of an object manipulated over six degrees of freedom (6-DOF). The therapist moves the couch with the patient on it and acquires an X-ray image after each move. This procedure is time consuming and can expose the patient to large doses of X-ray radiation that contribute nothing to treatment.

  Automatic methods may position the patient incorrectly, so the radiotherapist must verify the result of automatic positioning. In addition, the therapist must manipulate the rendered 3D volume to set the initial state or mark point correspondences using a conventional input device such as a mouse or trackball, which is difficult and unintuitive for the following reasons.

  The CT scan data forms a 3D volume that must be manipulated with 6-DOF. Such manipulation using a 2D manipulator (a mouse) and 2D images is difficult.

  With a conventional user interface, the mouse and the display device are not in the same location, which makes it difficult to establish correspondences.

  An object of the present invention is to provide a system and method for positioning an object based on an image of the object.

  One embodiment of the present invention describes a method for positioning an object from a first posture to a second posture. The method simultaneously displays a first image and a second image on a display device, the first image representing the first posture of the object and the second image representing the second posture. The method updates the second image in response to a change in the posture of the display device, such that the alignment of the first image and the second image on the display device depends on the change in the posture of the display device, and positions the object from the first posture to the second posture based on that change.

  Another embodiment of the invention describes a system for positioning an object from a first posture to a second posture, comprising: a display device configured to simultaneously display a first image and a second image, the first image representing the first posture of the object and the second image representing the second posture of the object; a rendering engine configured to update the second image in response to a change in the posture of the display device, such that the alignment of the first image and the second image on the display device depends on the change in the posture of the display device; and a positioning module configured to position the object from the first posture to the second posture based on the change in the posture of the display device.

FIG. 1 is a block diagram of a method and system for positioning an object according to an embodiment of the invention.
FIG. 2 is a block diagram of a method for displaying and updating a current data model according to an embodiment of the invention.
FIG. 3 is a block diagram of a method for coordinate transformation according to an embodiment of the invention.
FIG. 4 is a block diagram of a method for comparison of data models according to an embodiment of the invention.
FIGS. 5, 6, and 7 are schematic diagrams of a horizontal manipulator according to embodiments of the invention.
FIG. 8 is a schematic diagram of a vertical manipulator according to an embodiment of the invention.

  FIG. 1 illustrates a method and system 100 for positioning an object from a first posture 111 to a second posture 121 according to an embodiment of the present invention. The method positions the object based on the alignment of a first image 110 and a second image 120 on a display device 130. The first image represents the first posture of the object, the second image represents the second posture, and the two images are displayed simultaneously on the display device. The method 100 is performed by a processor 101, which comprises memory for the various models, input/output interfaces, and a graphics processing unit (GPU), as known in the art.

  The first image is rendered (151) from an image 113 acquired (112) from the object positioned in the first posture. The second image is rendered (150) from a volume image 123 acquired (122) from the object positioned in the second posture. In some embodiments, the images are compared (410) as described below.

  In some embodiments, the object is a patient 182 positioned for treatment. In one embodiment, the volume image is constructed from a detailed computed tomography (CT) scan acquired from the patient during treatment planning. In other embodiments, the volume image is acquired by magnetic resonance imaging (MRI) or positron emission tomography (PET). The first image is constructed from X-ray images acquired from the patient during placement for a treatment session. It should be understood that the object may be any arbitrary object; the patient is just one example.

  The posture of the object has a 3D location and a 3D orientation, giving six degrees of freedom (6-DOF). The images are aligned by changing the posture of the display device. Typically, an operator moves the display device with 6-DOF until the second image is aligned with the first image. After the images are aligned, e.g., superimposed on each other, the change in the posture of the display device yields a transformation, e.g., transformation parameters 155, for positioning (180) the object from the first posture to the second posture.
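A 6-DOF pose change of this kind is conventionally represented as a 4x4 homogeneous rigid transform. The following sketch is illustrative only and not part of the patent: the `rigid_transform` helper and its Z-Y-X Euler convention are assumptions, since the patent does not specify a parameterization.

```python
import numpy as np

def rigid_transform(tx, ty, tz, rx, ry, rz):
    """Build a 4x4 homogeneous transform from 6-DOF parameters:
    translations (tx, ty, tz) and rotations (rx, ry, rz) in radians,
    composed as Rz @ Ry @ Rx (an assumed convention)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined 3D orientation
    T[:3, 3] = [tx, ty, tz]    # 3D location
    return T
```

Applying such a transform to homogeneous points moves the object (or couch) from the first posture toward the second.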

  The idea behind the invention can be compared to a classic children's game in which a tray has a target hole and a marble, and the player uses both hands to tilt the tray and guide the marble from its current position into the target hole. The invention, however, inverts the game: the marble stays fixed in its current position, and instead the target hole is manipulated into alignment with the marble. In the invention, the tray is replaced by a weightlessly buffered display; the hole and marble become images that are translated and rotated; and, by applying optical encoder measurements, functional transformations, and distance metrics such as normalized cross-correlation and Kullback-Leibler divergence, the particle beam is aligned with the patient naturally, using both hands.

  Usually, the image 113 is two-dimensional (2D), so the rendering engine 151 transmits it for display substantially unchanged. The volume image 123, however, is three-dimensional (3D) and is converted to 2D by the rendering engine 150. The rendering engine 150 renders the second image from the volume image based on the change 154 in the posture of the display device, obtained by the posture measurement module 160 as the display device moves. Because the second image is updated according to the change in the posture of the display device, the alignment of the first image and the second image on the display device depends on that change.

  FIG. 2 illustrates a method for rendering the second image on the display device. Based on the change 154 in the posture of the display device, e.g., a translation and a rotation, the method determines (210) view parameters 230 representing the object in a posture 255 resulting from a change 254 in the second posture; according to the embodiment, the change in the second posture corresponds (250) to the change in the posture of the display device. A graphics processing unit (GPU) 240 renders the second image from the volume image 123 according to the view parameters.

  For example, in one embodiment, the second image 120 is a synthetic X-ray view, i.e., a digitally reconstructed radiograph (DRR), as if acquired by an X-ray imaging device from the object in posture 255.
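In its simplest form, a DRR can be approximated by integrating attenuation along rays through the CT volume and applying Beer-Lambert attenuation. The toy sketch below is not the patent's renderer: it assumes a parallel-beam geometry, whereas a real DRR generator would use the perspective geometry and calibrated parameters of the X-ray device.

```python
import numpy as np

def render_drr_parallel(volume, axis=0):
    """Toy DRR: sum voxel attenuation along parallel rays down one axis
    (one ray per output pixel) and convert the line integrals to
    X-ray-like transmitted intensities via Beer-Lambert attenuation."""
    line_integrals = volume.sum(axis=axis)
    return np.exp(-line_integrals)
```

A dense region of the volume then appears dark (low transmission) in the rendered image, as in a radiograph.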

  FIG. 3 shows the coordinate transformation 300 performed by the rendering engine 150. The volume image 123 has an associated global coordinate system 320, and the view parameters 230 are associated with the image coordinate system 330 of the display device. If the internal and external parameters of the camera, e.g., an X-ray imaging device, are known, e.g., the focal length and the location of the principal point in the image, then the only parameters that remain to be recovered are obtained from the change 154 in the posture of the display device.
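The internal and external parameters mentioned here are those of the standard pinhole camera model, under which a world point maps to image coordinates as sketched below. This is an illustration only; `project_point` is a hypothetical helper, not a function from the patent.

```python
import numpy as np

def project_point(K, R, t, X):
    """Project a 3D world point X to pixel coordinates using the pinhole
    model: intrinsics K (focal length, principal point) and extrinsics
    [R|t] relating the global and camera coordinate systems."""
    x_cam = R @ X + t          # global frame -> camera frame
    u, v, w = K @ x_cam        # homogeneous image coordinates
    return np.array([u / w, v / w])
```

With K fixed by calibration, only R and t (the pose change of the display device) need to be recovered, as the paragraph above states.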

  The display device is rigidly mounted on a manipulator 170 having 6-DOF, so the display device can be moved with 6-DOF. A change in the posture of the manipulator, and hence indirectly a change in the posture of the display device, is detected by the posture measurement module 160. The change in the manipulator's posture is described by the transformation parameters 155. In one embodiment, after the first and second images are aligned, the patient 182 is positioned (180) for radiation therapy based on the transformation parameters.

Image Comparison
FIG. 4 shows the comparison module. The comparison module compares (410) the first image and the second image and produces a comparison result 420 based on a similarity metric, e.g., Euclidean distance, Kullback-Leibler (KL) divergence, or normalized cross-correlation.

  The first image and the second image may be misaligned when initially displayed. The purpose of the alignment process is to bring the images into alignment by moving the display device, which is usually done by an operator. To facilitate alignment, some embodiments render the comparison result, e.g., the direction and/or magnitude of the misalignment, on the display device. In one embodiment, the misalignment is highlighted on the display device. In another embodiment, the value of the comparison result is displayed. In yet another embodiment, an expandable bar indicating the misalignment is displayed beside the display device.
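Of the similarity metrics mentioned, normalized cross-correlation is the simplest to sketch. The following illustrative implementation (not from the patent) computes zero-mean NCC between two equal-size images; a value near 1.0 would indicate good alignment of intensities.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-mean normalized cross-correlation between two equal-size
    images: 1.0 for identical (up to brightness/contrast) images,
    -1.0 for perfectly anti-correlated ones."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)
```

Because the mean and scale are normalized out, the metric is insensitive to global brightness and contrast differences between the X-ray image and the rendered DRR, which is one reason NCC is a common choice for this comparison.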

N-DOF
During positioning of the object, the 6-DOF corresponds to three rotations and three translations, i.e., a rigid transformation. However, in some embodiments the volume image 123 is acquired several weeks before the image 113, and the image 113 reflects non-rigid deformation. For example, soft tissue alignment may require some fine local stretching and pushing of the tissue: the intestines may contain gas, the kidneys may move, and the lungs may be at different levels of inflation. Therefore, in one embodiment the display device 130 includes a touch-sensitive screen. This embodiment allows both rigid transformation by moving the display device and non-rigid correction by manipulating the image via the touch-sensitive screen.

Horizontal Manipulator
FIG. 5 shows the display device and a horizontal manipulator 500. The horizontal manipulator is a modified hexapod (Hayward et al., "Kinematic Decoupling in Mechanisms and Application to a Passive Hand Controller," Journal of Robotic Systems, Wiley, Vol. 10, No. 5, pp. 767-790, 1993).

  The manipulator 500 enables 6-DOF manipulation of the display device. The manipulator includes an upper platform 510 and a lower platform 520, each platform having an upper surface 515 and a lower surface 525 on opposite sides. The lower surface of the upper platform typically faces, and is spaced apart from, the upper surface of the lower platform.

  The manipulator further includes a buffer mechanism 530 that provides weightless buffering for the display device. The buffering is performed by a system of leg assemblies 600, one of which is shown in FIG. 6. Each leg assembly includes a long "antigravity" spring 610 to compensate for the weight of the display, and two further springs, an extension compensation spring 620 and a compression compensation spring 630, that re-center the buffer mechanism 530 when the upper platform is released. Two freely rotating ball joints 541 and 542 connect the leg assembly to the platforms. In one embodiment, the buffer mechanism comprises six independent leg assemblies extending between the platforms.

  In one embodiment, the upper surface 515 of the upper platform 510 serves as a display platform that is fixedly connected to the display device 130. In another embodiment, the display platform is the lower surface of the lower platform.

  The buffer mechanism 530 allows 6-DOF positioning of the display platform. The force required to displace the display along any of the degrees of freedom is proportional to the expansion of the leg assembly. See Hayward et al. For further details.

  As shown in FIG. 7, the posture measurement module comprises a camera 710 connected to one platform and a calibration pattern 720 on the other platform. For example, in one embodiment the camera is connected to the upper platform and the calibration pattern to the lower platform. The pattern and camera can also be reversed; the only requirement is that one is fixed while the other moves freely. In another embodiment, the calibration pattern is a grid pattern.

  The posture detector 730 obtains the posture of the camera from images of the calibration pattern captured by the camera. In one embodiment, the posture detector runs on the camera's processor. In another embodiment, the calibration pattern is back-lit by lamps and diffusers of the kind commonly used in liquid crystal displays (LCDs), allowing the manipulator 500 to be used in a dim environment such as a treatment room.

  From the image of the calibration pattern, a plane projective transformation between the global coordinate system and the image coordinate system, i.e., the transformation parameters 155, is obtained. An example of determining transformation parameters using computer-vision-based estimation is described in U.S. Pat. No. 6,527,395, incorporated herein by reference.
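A plane projective transformation (homography) between pattern and image coordinates can be estimated from four or more point correspondences with the direct linear transform (DLT). The sketch below is illustrative only and is not the estimation method of the cited patent.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 plane projective transform mapping src -> dst
    points (each an (N, 2) array, N >= 4, not all collinear) using the
    direct linear transform: stack two linear constraints per
    correspondence and take the SVD null vector."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]   # normalize so H[2, 2] == 1
```

Since the calibration pattern is a known planar grid, its detected corners give many such correspondences, and the resulting homography encodes the camera pose relative to the pattern.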

Vertical Manipulator
FIG. 8 shows a vertical manipulator that has an increased range of motion, a vertically mounted display, a direct force mapping to the DOF variables, and no central "attractor," i.e., no position to which the system returns when no force is applied.

  The vertical manipulator 800 includes a display platform 810 connected to an operating mechanism for 6-DOF manipulation. The posture measurement module includes rotary optical encoders 820. The vertical and in/out displacements are encoded by rotating wheels with cable fittings, so the motion is sensed as a combination of two rotations and one in/out displacement; note that there is no lateral linear displacement sensor. From the set of six calibrated rotation measurements, the transformation parameters are computed by propagating the rotations along the kinematic chain.

  The vertical manipulator includes a 3-DOF gimbal assembly 830 connected to the display platform, and a vertically oriented central post 840. The vertical manipulator further includes a swing arm 850 that connects the gimbal assembly to the central post. The swing arm rotates about the central post, slides up and down vertically, and extends or retracts horizontally.

  In one embodiment, the swing arm is connected to a support 860 and is configured to slide horizontally through an opening 865. The support, in turn, is positioned around the central post, allowing the swing arm to slide vertically.

  In one embodiment, the display platform has four rotations and two translations, which do not correspond directly to the target system of three rotations and three translations. In this embodiment, lateral translation of the display platform is achieved by a rotation 845 about the central post, outward extension of the swing arm, and a gimbal rotation about the vertical axis opposite in direction to the rotation 845.

  In one embodiment, the vertical displacement carries a constant load of about 60 kg. Because the load does not change, thanks to the decoupling of the swing arm and the elimination of the cantilever effect, only static compensation by a counterweight is required to balance the swing arm and gimbal assembly.

  Although the invention has been described by way of examples of preferred embodiments, it is to be understood that various other adaptations and modifications can be made within the spirit and scope of the invention. Accordingly, it is the object of the appended claims to cover all modifications and variations that fall within the true spirit and scope of the invention.

Claims (19)

  1. A method for positioning an object from a first posture to a second posture, the method being performed by a computer and comprising the steps of:
    simultaneously displaying, on a display device, a first image representing the first posture of the object and a second image representing the second posture of the object;
    updating the second image in response to a change in the posture of the display device, such that the alignment of the first image and the second image on the display device depends on the change in the posture of the display device; and
    positioning the object from the first posture to the second posture based on the change in the posture of the display device.
  2. The method of claim 1, further comprising:
    rendering the first image from an image acquired from the object in the first posture; and
    rendering the second image from a volume image acquired from the object in the second posture.
  3. The method of claim 2, wherein the updating step comprises:
    determining a change in the posture of the display device;
    determining view parameters representing the object in a posture resulting from a change in the second posture corresponding to the change in the posture of the display device; and
    rendering the second image from the volume image according to the view parameters.
  4. The method of claim 2, wherein the object is a patient positioned for treatment and the first image is rendered from an X-ray image acquired from the patient.
  5. The method of claim 4, wherein the volume image is constructed from volume data acquired from the patient using computed tomography, magnetic resonance imaging, or positron emission tomography.
  6. The method of claim 1, further comprising: superimposing the second image on the first image on the display device based on a change in the posture of the display device.
  7. The method of claim 1, further comprising: determining transformation parameters suitable for positioning the object based on a change in the posture of the display device.
  8. The method of claim 7, wherein the display device includes a touch-sensitive screen, and the first image and the second image can be manipulated on the display device using the touch-sensitive screen, the method further comprising:
    correcting the transformation parameters based on manipulation of the first image and the second image on the display device.
  9. The method of claim 1, further comprising:
    comparing the first image and the second image to generate a comparison result; and
    displaying the comparison result on the display device.
  10. The method of claim 9, wherein the comparison result indicates a direction of misalignment.
  11. The method of claim 9, wherein the comparison result is a similarity metric.
  12. The method of claim 1, further comprising: fixedly placing the display device on a manipulator having six degrees of freedom, the manipulator comprising a buffer mechanism configured to provide weightless buffering to the display device such that the display device can move with six degrees of freedom.
  13. A system for positioning an object from a first posture to a second posture, comprising:
    a display device configured to simultaneously display a first image representing the first posture of the object and a second image representing the second posture of the object;
    a rendering engine configured to update the second image in response to a change in the posture of the display device, such that the alignment of the first image and the second image on the display device depends on the change in the posture of the display device; and
    a positioning module configured to position the object from the first posture to the second posture based on the change in the posture of the display device.
  14. The system of claim 13, further comprising:
    a touch-sensitive screen mounted on the display device such that the second image can be manipulated on the display device using the touch-sensitive screen;
    means for determining transformation parameters suitable for positioning the object based on the posture of the display device; and
    means for correcting the transformation parameters based on manipulation of the second image using the touch-sensitive screen.
  15. The system of claim 13, further comprising:
    a camera connected to the display device such that a change in the posture of the camera is determined by a change in the posture of the display device;
    a calibration pattern positioned within the focus of the camera; and
    a posture measurement module configured to determine the change in the posture of the camera based on an image of the calibration pattern acquired by the camera.
  16. The system of claim 13, further comprising a manipulator having six degrees of freedom fixedly connected to the display device.
  17. The system of claim 16, wherein the manipulator is configured to provide weightless buffering for the display device.
  18. A system for positioning an object from a first posture to a second posture, comprising:
    a manipulator having six degrees of freedom, configured to be fixedly connected to a display device configured to simultaneously display a first image representing the first posture of the object and a second image representing the second posture of the object;
    a rendering engine configured to update the second image in response to a change in the posture of the manipulator, such that the alignment of the first image and the second image on the display device depends on the change in the posture of the manipulator; and
    a positioning module configured to determine the change in the posture of the manipulator, the object being suitable for being positioned from the first posture to the second posture based on the change in the posture of the manipulator.
  19. The system of claim 18, further comprising:
    means for determining view parameters representing the object in a posture resulting from a change in the second posture corresponding to the change in the posture of the manipulator; and
    a graphics processing unit configured to render the second image from a volume image according to the view parameters.
JP2010121174A 2009-06-30 2010-05-27 Method and system for positioning an object from a first posture to a second posture Active JP4827981B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/495,744 US7934869B2 (en) 2009-06-30 2009-06-30 Positioning an object based on aligned images of the object
US12/495,744 2009-06-30

Publications (3)

Publication Number Publication Date
JP2011011051A JP2011011051A (en) 2011-01-20
JP2011011051A5 JP2011011051A5 (en) 2011-07-07
JP4827981B2 true JP4827981B2 (en) 2011-11-30

Family

ID=43380742

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010121174A Active JP4827981B2 (en) 2009-06-30 2010-05-27 Method and system for positioning an object from a first posture to a second posture

Country Status (2)

Country Link
US (1) US7934869B2 (en)
JP (1) JP4827981B2 (en)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2004266654B2 (en) 2003-08-12 2011-07-21 Loma Linda University Medical Center Modular patient support system
JP5046928B2 (en) 2004-07-21 2012-10-10 メヴィオン・メディカル・システムズ・インコーポレーテッド Synchrocyclotron and method for generating particle beams
ES2730108T3 (en) 2005-11-18 2019-11-08 Mevion Medical Systems Inc Radiation therapy of charged particles
NL1033178C2 (en) * 2007-01-05 2008-07-11 Scarabee Id B V Baggage drop-off system.
US8933650B2 (en) 2007-11-30 2015-01-13 Mevion Medical Systems, Inc. Matching a resonant frequency of a resonant cavity to a frequency of an input voltage
US8581523B2 (en) 2007-11-30 2013-11-12 Mevion Medical Systems, Inc. Interrupted particle source
AU2009217348B2 (en) 2008-02-22 2014-10-09 Loma Linda University Medical Center Systems and methods for characterizing spatial distortion in 3D imaging systems
US8077328B2 (en) * 2009-07-06 2011-12-13 Gammex, Inc. Variable color incoherent alignment line and cross-hair generator
EP2483710A4 (en) 2009-10-01 2016-04-27 Univ Loma Linda Med Ion induced impact ionization detector and uses thereof
US8088055B2 (en) * 2010-05-24 2012-01-03 Mitsubishi Electric Research Laboratories, Inc. Plan-based medical image registration for radiotherapy
WO2014010073A1 (en) * 2012-07-13 2014-01-16 三菱電機株式会社 X-ray positioning apparatus, x-ray positioning method, and image-of-interest imaging method
DE102012216687A1 (en) * 2012-09-18 2014-03-20 Jan Rimbach Apparatus for testing specimens
EP2901821A2 (en) 2012-09-28 2015-08-05 Mevion Medical Systems, Inc. Magnetic field regenerator
TWI604868B (en) 2012-09-28 2017-11-11 美威高能離子醫療系統公司 Particle accelerator and proton therapy system
US10254739B2 (en) 2012-09-28 2019-04-09 Mevion Medical Systems, Inc. Coil positioning system
EP2901824A2 (en) 2012-09-28 2015-08-05 Mevion Medical Systems, Inc. Magnetic shims to adjust a position of a main coil
CN104813749B (en) 2012-09-28 2019-07-02 梅维昂医疗系统股份有限公司 Control the intensity of the particle beams
EP2900326B1 (en) 2012-09-28 2019-05-01 Mevion Medical Systems, Inc. Controlling particle therapy
US9301384B2 (en) 2012-09-28 2016-03-29 Mevion Medical Systems, Inc. Adjusting energy of a particle beam
WO2014052718A2 (en) 2012-09-28 2014-04-03 Mevion Medical Systems, Inc. Focusing a particle beam
JP6121546B2 (en) 2012-09-28 2017-04-26 メビオン・メディカル・システムズ・インコーポレーテッド Control system for particle accelerator
US8791656B1 (en) 2013-05-31 2014-07-29 Mevion Medical Systems, Inc. Active return system
US9730308B2 (en) 2013-06-12 2017-08-08 Mevion Medical Systems, Inc. Particle accelerator that produces charged particles having variable energies
CN105764567B (en) 2013-09-27 2019-08-09 梅维昂医疗系统股份有限公司 Particle beam scanning
US9962560B2 (en) 2013-12-20 2018-05-08 Mevion Medical Systems, Inc. Collimator and energy degrader
US9661736B2 (en) 2014-02-20 2017-05-23 Mevion Medical Systems, Inc. Scanning system for a particle therapy system
JP6400307B2 (en) * 2014-03-10 2018-10-03 キヤノンメディカルシステムズ株式会社 X-ray diagnostic imaging equipment
KR101485292B1 (en) * 2014-04-07 2015-01-28 재단법인대구경북과학기술원 Robot
KR101403787B1 (en) * 2014-04-07 2014-06-03 재단법인대구경북과학기술원 Medical robot
KR101485291B1 (en) * 2014-04-07 2015-01-21 재단법인대구경북과학기술원 Robot
US9950194B2 (en) 2014-09-09 2018-04-24 Mevion Medical Systems, Inc. Patient positioning system
US9665989B1 (en) * 2015-02-17 2017-05-30 Google Inc. Feature agnostic geometric alignment
US10180207B1 (en) * 2017-07-13 2019-01-15 Danylo Kozub Stand

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998002091A1 (en) * 1996-07-11 1998-01-22 The Board Of Trustees Of The Leland Stanford Junior University High-speed inter-modality image registration via iterative feature matching
FI103761B (en) * 1997-12-12 1999-09-30 Planmeca Oy medical imaging
US6463121B1 (en) * 1999-10-13 2002-10-08 General Electric Company Interactive x-ray position and exposure control using image data as reference information
DE19953177A1 (en) * 1999-11-04 2001-06-21 Brainlab Ag Method to position patient exactly for radiation therapy or surgery; involves comparing positions in landmarks in X-ray image and reconstructed image date, to determine positioning errors
US7187792B2 (en) * 2003-08-29 2007-03-06 Accuray, Inc. Apparatus and method for determining measure of similarity between images
JP3859683B2 (en) * 2005-10-17 2006-12-20 株式会社日立製作所 Bed positioning apparatus, positioning method therefor, and particle beam therapy apparatus

Also Published As

Publication number Publication date
JP2011011051A (en) 2011-01-20
US20100329432A1 (en) 2010-12-30
US7934869B2 (en) 2011-05-03

Similar Documents

Publication Publication Date Title
US8131344B2 (en) Method and system for registering a medical situation associated with a first coordinate system, in a second coordinate system using an MPS system
US7199382B2 (en) Patient alignment system with external measurement and object coordination for radiation therapy system
US7831073B2 (en) Precision registration of X-ray images to cone-beam CT scan for image-guided radiation treatment
CN101478918B (en) Parallel stereovision geometry in image-guided radiosurgery
US8981324B2 (en) Patient alignment system with external measurement and object coordination for radiation therapy system
JP5061106B2 (en) Imaging geometry
US8223920B2 (en) Patient positioning imaging device and method
EP1446989B1 (en) Device for aligning a patient for delivering radiotherapy
EP1556128B1 (en) An imaging device for radiation treatment applications
US7894649B2 (en) Target tracking using direct target registration
Cleary et al. Interventional robotic systems: applications and technology state‐of‐the‐art
US8180432B2 (en) Correlation model selection for internal target movement
JP3053389B1 (en) Moving body tracking irradiation device
US20090054772A1 (en) Focused Ultrasound Therapy System
US8295435B2 (en) Cardiac target tracking
US20140152310A1 (en) Gantry for mobilizing an mri device
US7552490B2 (en) Method and apparatus for patient loading and unloading
Yu et al. An anthropomorphic phantom study of the accuracy of Cyberknife spinal radiosurgery
US7166852B2 (en) Treatment target positioning system
Shimizu et al. Use of an implanted marker and real-time tracking of the marker for the positioning of prostate and bladder cancers
US20080037843A1 (en) Image segmentation for DRR generation and image registration
JP5133904B2 (en) Adaptive X-ray control
RU2640566C2 (en) Personal and automatic correction of x-ray system based on optical detection and interpretation of three-dimensional scene
US7623623B2 (en) Non-collocated imaging and treatment in image-guided radiation treatment systems
Ozhasoglu et al. Synchrony–cyberknife respiratory compensation technology

Legal Events

Date Code Title Description
A871 Explanation of circumstances concerning accelerated examination

Free format text: JAPANESE INTERMEDIATE CODE: A871

Effective date: 20110519

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110519

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110519

A975 Report on accelerated examination

Free format text: JAPANESE INTERMEDIATE CODE: A971005

Effective date: 20110609

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110621

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110714

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110816

R150 Certificate of patent or registration of utility model

Ref document number: 4827981

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140922

Year of fee payment: 3

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110913

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
