WO2018212231A1 - Image processing device, image processing system, and image processing method - Google Patents

Image processing device, image processing system, and image processing method

Info

Publication number
WO2018212231A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
heart
processing apparatus
target
Prior art date
Application number
PCT/JP2018/018902
Other languages
French (fr)
Japanese (ja)
Inventor
康之 本間
Original Assignee
テルモ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by テルモ株式会社
Priority to JP2019518836A (JPWO2018212231A1)
Publication of WO2018212231A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02: Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03: Computed tomography [CT]
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography

Definitions

  • the present disclosure relates to an image processing apparatus, an image processing system, and an image processing method.
  • Patent Document 1 describes estimating a portion of the heart with decreased wall motion as an abnormal portion from an ultrasonic image or the like and creating an image for diagnosis.
  • an object of the present disclosure is to provide an image processing apparatus, an image processing system, and an image processing method capable of performing treatment simulation before performing treatment.
  • The image processing apparatus as a first aspect includes a storage unit that stores a three-dimensional image of the heart including an abnormal site of the heart, and a control unit that estimates a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  • In one embodiment, the storage unit stores physical property information of the administration target, and the control unit estimates the permeation region based on the physical property information of the administration target.
  • In one embodiment, the storage unit stores the dose of the administration target, and the control unit estimates the permeation region based on the dose of the administration target.
  • In one embodiment, the control unit estimates the wall thickness of the heart based on the three-dimensional image and estimates the permeation region based on the wall thickness.
  • In one embodiment, the control unit estimates the position of a blood vessel in the heart based on the three-dimensional image and estimates the permeation region based on the position of the injection point relative to the position of the blood vessel.
  • In one embodiment, the storage unit stores a plurality of the three-dimensional images captured at predetermined time intervals, and the control unit estimates the permeation region based on temporal changes of the plurality of three-dimensional images.
  • In one embodiment, the storage unit stores shape information of an injection member that injects the administration target, and the control unit estimates the permeation region based on the shape information of the injection member.
  • An image processing system as a second aspect includes a first imaging device that captures a first tomographic image of the heart from outside the body, a second imaging device that captures a second tomographic image of the heart from outside the body, and an image processing apparatus. The image processing apparatus includes an image input unit that receives input of the first tomographic image and the second tomographic image, and a control unit that generates a three-dimensional image of the heart including an abnormal site of the heart based on the first tomographic image and the second tomographic image and estimates a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  • An image processing method as a third aspect is an image processing method executed using an image processing apparatus, and includes a three-dimensional image storage step of storing a three-dimensional image of the heart including an abnormal site of the heart, and a permeation region estimation step of estimating a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  • An image processing apparatus as a fourth aspect includes a storage unit that stores a three-dimensional image of the heart including an abnormal site of the heart, a display unit capable of displaying the three-dimensional image, and a control unit that determines, based on the three-dimensional image, positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site and causes the display unit to display the target injection points superimposed on the three-dimensional image.
  • In one embodiment, the control unit determines the order of the plurality of target injection points and causes the display unit to display the plurality of target injection points in a manner based on the order.
  • In one embodiment, the control unit estimates a movement path along which the distal end portion of an injection member that injects the administration target moves via the plurality of target injection points, and determines the order based on the movement path.
  • In one embodiment, the control unit estimates a permeation region into which the administration target permeates when the administration target is injected at a predetermined injection point of the abnormal site, and determines the positions of the plurality of target injection points based on the permeation region.
  • An image processing system as a fifth aspect includes a first imaging device that captures a first tomographic image of the heart from outside the body, a second imaging device that captures a second tomographic image of the heart from outside the body, and an image processing apparatus. The image processing apparatus includes an image input unit that receives input of the first tomographic image and the second tomographic image, a display unit, and a control unit that generates a three-dimensional image of the heart including an abnormal site of the heart based on the first tomographic image and the second tomographic image, determines positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site based on the three-dimensional image, and causes the display unit to display the plurality of target injection points superimposed on the three-dimensional image.
  • An image processing method as a sixth aspect is an image processing method executed using an image processing apparatus, and includes a three-dimensional image storage step of storing a three-dimensional image of the heart including an abnormal site of the heart, a three-dimensional image display step of displaying the three-dimensional image, a target injection point determination step of determining positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site based on the three-dimensional image, and a target injection point display step of displaying the plurality of target injection points superimposed on the three-dimensional image.
  • According to the image processing apparatus, the image processing system, and the image processing method of the present disclosure, a treatment can be simulated before it is performed.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an image processing system including an image processing apparatus as an embodiment of the present invention.
  • FIG. 2 is a flowchart showing an overview of the image processing performed by the image processing apparatus shown in FIG. 1. FIG. 3 is a flowchart showing details of the target part specifying process performed by the image processing apparatus shown in FIG. 1. FIG. 4 is a diagram explaining the image processing accompanying the target part specifying process. FIG. 5 is a schematic diagram showing an example of the permeation region estimated by the permeation region estimation process. FIG. 6 is a flowchart showing details of the target injection point determination process. FIG. 7 is a schematic diagram showing an example of the target injection points determined by the target injection point determination process. FIG. 8 is a diagram showing a state of treatment by the injection member.
  • FIG. 1 is a block diagram showing a schematic configuration of an image processing system 1 including an image processing apparatus 10 as an embodiment of the present invention.
  • the image processing system 1 includes an image processing device 10, an ultrasonic image generation device 20 as a first imaging device, a radiation image generation device 30 as a second imaging device, and a heartbeat acquisition device 40.
  • the ultrasound image generation device 20 as the first imaging device is located outside the body of the subject and captures an ultrasound image as a first tomographic image of the heart from outside the subject.
  • The ultrasonic image generation device 20 includes an ultrasonic transmission unit 21 that transmits ultrasonic waves, an ultrasonic reception unit 22 that receives ultrasonic waves, and an image forming unit 23 that forms a first tomographic image based on the ultrasonic waves received by the ultrasonic reception unit 22.
  • the ultrasonic image generation apparatus 20 transmits ultrasonic waves from the ultrasonic transmission unit 21 toward the subject's heart in a state where the ultrasonic transmission unit 21 and the ultrasonic reception unit 22 are in contact with the body surface of the subject.
  • the ultrasonic wave reception unit 22 receives the ultrasonic wave reflected from the heart of the subject.
  • the ultrasonic image generating apparatus 20 obtains a tomographic image along the ultrasonic traveling surface as a first tomographic image by processing the ultrasonic wave received by the ultrasonic wave receiving unit 22 by the image forming unit 23.
  • the ultrasonic image generation apparatus 20 outputs the captured first tomographic image to the image input unit 11 of the image processing apparatus 10.
  • The ultrasonic image generation apparatus 20 may generate, as the first tomographic image, a three-dimensional image based on a plurality of tomographic images captured along different planes while changing the position or orientation of the ultrasonic transmission unit 21 and the ultrasonic reception unit 22. That is, the first tomographic image may be a tomographic image captured along one plane, or a three-dimensional image generated from a plurality of tomographic images captured along a plurality of planes.
  • the radiation image generating device 30 as a second imaging device is located outside the body of the subject and captures a radiation image as a second tomographic image of the heart from outside the subject.
  • the radiation image generation apparatus 30 is, for example, a computed tomography (CT) apparatus.
  • the radiation image generating apparatus 30 includes a radiation emitting unit 31 that emits radiation, a radiation detecting unit 32 that detects radiation, and an image forming unit 33 that forms a second tomographic image based on the radiation detected by the radiation detecting unit 32.
  • the radiation image generating apparatus 30 includes a radiation emitting unit 31 and a radiation detecting unit 32 at positions facing each other around the subject, and while rotating the radiation emitting unit 31 and the radiation detecting unit 32 around the subject, Radiation such as X-rays is emitted from the radiation emitting unit 31 toward the subject's heart, and the radiation that has passed through the subject's heart is detected by the radiation detection unit 32.
  • the radiological image generation apparatus 30 obtains a radiological image that is a three-dimensional image of the heart as a second tomographic image by processing the radiation detected by the radiation detection unit 32 using the image forming unit 33.
  • the radiation image generation apparatus 30 outputs the captured second tomographic image to the image input unit 11 of the image processing apparatus 10.
  • the second imaging device may be a magnetic resonance image generation (MRI) device instead of the radiation image generation device 30.
  • the magnetic resonance image generation apparatus is located outside the subject's body and captures a magnetic resonance image as a second tomographic image of the heart from outside the subject's body.
  • The magnetic resonance image generating apparatus includes a magnetic field generating unit that generates a magnetic field, a signal receiving unit that receives nuclear magnetic resonance signals, and an image forming unit that forms, as the second tomographic image, a magnetic resonance image that is a three-dimensional image based on the nuclear magnetic resonance signals received by the signal receiving unit.
  • a contrast agent is administered to the subject's heart a predetermined time before the second tomographic image is captured by the radiation image generation device 30 or the magnetic resonance image generation device as the second imaging device. Accordingly, the second tomographic image captured by the second imaging device includes a delayed contrast image.
  • The second imaging apparatus may be, instead of the radiation image generation apparatus 30 or the magnetic resonance image generation apparatus, a nuclear medicine examination apparatus that performs scintigraphy, SPECT (Single Photon Emission Computed Tomography), PET (Positron Emission Tomography), or similar examinations.
  • The nuclear medicine examination apparatus is located outside the body of the subject and acquires a radioisotope (RI) distribution image as a second tomographic image of the heart from outside the subject.
  • the nuclear medicine examination apparatus obtains the second tomographic image by imaging the distribution of the drug labeled with the radioisotope previously administered to the subject.
  • the heartbeat acquisition device 40 acquires heart beat information of the subject.
  • the heart beat information includes time change information of the heart beat.
  • The heartbeat acquisition device 40 may acquire the heart beat information at the same time as the first tomographic image or the second tomographic image is captured and associate the heart beat information with that image.
  • the heartbeat acquisition device 40 is, for example, an electrocardiogram monitor that measures temporal changes in cardiac action potential via electrodes attached to the subject's chest or limbs and continuously displays an electrocardiogram waveform.
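  • As an illustration of how such an association can be made, the following sketch (hypothetical data structures; the disclosure does not specify an implementation) pairs each tomographic frame with the ECG sample acquired closest in time:

```python
import numpy as np

def associate_beats_with_frames(frame_times_s, ecg_times_s, ecg_mv):
    """Pair each tomographic frame with the ECG sample acquired closest in time.

    frame_times_s : (F,) acquisition times of the tomographic frames [s]
    ecg_times_s   : (E,) acquisition times of the ECG samples [s]
    ecg_mv        : (E,) ECG amplitudes at those times [mV]
    """
    frame_times_s = np.asarray(frame_times_s, dtype=float)
    ecg_times_s = np.asarray(ecg_times_s, dtype=float)
    # Index of the nearest ECG sample for every frame.
    idx = np.abs(ecg_times_s[None, :] - frame_times_s[:, None]).argmin(axis=1)
    return list(zip(frame_times_s, np.asarray(ecg_mv)[idx]))

# Usage: frames every 50 ms, ECG sampled at 1 kHz (toy waveform).
frames = np.arange(0.0, 1.0, 0.05)
t_ecg = np.arange(0.0, 1.0, 0.001)
ecg = np.sin(2 * np.pi * 1.2 * t_ecg)
pairs = associate_beats_with_frames(frames, t_ecg, ecg)
```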
  • the image processing apparatus 10 is located outside the body of the subject and is configured by an information processing apparatus such as a computer.
  • the image processing apparatus 10 includes an image input unit 11, a heartbeat input unit 12, an operation input unit 13, a display unit 14, a storage unit 15, and a control unit 16.
  • The image input unit 11 receives input of the first tomographic image from the ultrasonic image generation device 20 as the first imaging device and input of the second tomographic image from the radiation image generation device 30 as the second imaging device.
  • the image input unit 11 includes an interface that receives information from the ultrasonic image generation device 20 and the radiation image generation device 30 by, for example, wired communication or wireless communication.
  • the image input unit 11 outputs information on the input image to the control unit 16.
  • the heartbeat input unit 12 receives input of heart beat information from the heartbeat acquisition device 40.
  • the heartbeat input unit 12 includes an interface that receives information from the heartbeat acquisition device 40 by, for example, wired communication or wireless communication.
  • the heartbeat input unit 12 outputs the input heart beat information to the control unit 16.
  • the operation input unit 13 includes, for example, a keyboard, a mouse, or a touch panel.
  • the touch panel may be provided integrally with the display unit 14.
  • the operation input unit 13 outputs the input information to the control unit 16.
  • The display unit 14 displays, based on signals from the control unit 16, the first tomographic image, the second tomographic image, and images generated by the control unit 16 based on these.
  • the display unit 14 includes a display device such as a liquid crystal display or an organic EL display.
  • the storage unit 15 stores various information and programs for causing the control unit 16 to execute a specific function.
  • the storage unit 15 stores, for example, a three-dimensional image of the heart.
  • the three-dimensional image of the heart is the first tomographic image, the second tomographic image, or display information generated by the control unit 16 based on these in a target part specifying process described later.
  • the three-dimensional image of the heart includes an abnormal portion R ′ (see FIGS. 5 and 7) of the heart.
  • the abnormal part R ′ of the heart is, for example, a target part R (see FIG. 4C) specified by the control unit 16 in a target part specifying process described later.
  • the storage unit 15 stores a plurality of three-dimensional images based on a plurality of tomographic images captured at different times.
  • the storage unit 15 stores, for example, the dose and physical property information of the administration target to be injected into the abnormal site R ′ by treatment using an injection member described later.
  • the storage unit 15 stores, for example, shape information of the injection member.
  • the storage unit 15 includes a storage device such as a RAM or a ROM.
  • the control unit 16 controls the operation of each component that constitutes the image processing apparatus 10.
  • the control unit 16 executes a specific function by reading a specific program. Specifically, the control unit 16 generates display information based on the first tomographic image and the second tomographic image.
  • the control unit 16 causes the display unit 14 to display the generated display information.
  • the control unit 16 may output the generated display information to an external display device.
  • the control unit 16 includes a processor, for example.
  • the control unit 16 may generate display information by correcting the first tomographic image based on the second tomographic image.
  • For example, the control unit 16 detects feature points in the first tomographic image and corresponding feature points in the second tomographic image by pattern recognition or the like, and corrects the region including each feature point in the first tomographic image using the corresponding region of the second tomographic image, thereby generating display information in which the first tomographic image is corrected based on the second tomographic image.
  • Because the first tomographic image can be corrected with the higher-definition second tomographic image, the structure and shape of the heart can be shown more accurately.
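  • A minimal sketch of one way such feature-point-based correction could be realized, assuming the feature points have already been detected and matched (the disclosure does not specify the transform model; a rigid Kabsch alignment is used here purely for illustration):

```python
import numpy as np

def rigid_transform(src_pts, dst_pts):
    """Kabsch estimate of rotation R and translation t such that dst ≈ R @ src + t.

    src_pts : (N, D) matched feature points from the first tomographic image
    dst_pts : (N, D) corresponding feature points from the second tomographic image
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# The estimated (R, t) can then be used to resample the first tomographic image
# into the coordinate frame of the higher-definition second tomographic image.
```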
  • FIG. 2 is a flowchart showing an outline of image processing performed by the image processing apparatus 10.
  • the image processing apparatus 10 first performs a target part specifying process (step S10).
  • the image processing apparatus 10 performs penetration region estimation processing (step S20).
  • the image processing apparatus 10 performs target injection point determination processing (step S30).
  • FIG. 3 is a flowchart showing details of the target part specifying process performed by the image processing apparatus 10.
  • FIG. 4 is a diagram for explaining image processing accompanying the target region specifying process performed by the image processing apparatus 10, and is a diagram showing a cross section of the left ventricle LV of the heart.
  • First, the control unit 16 of the image processing apparatus 10 reads the first tomographic images input via the image input unit 11 and estimates the low-motion site P of the heart based on them (step S11: low-motion site estimation step). Specifically, the image input unit 11 receives input of a plurality of first tomographic images captured at predetermined time intervals.
  • The control unit 16 estimates the low-motion site P based on differences among the plurality of first tomographic images captured at the predetermined time intervals.
  • Specifically, the control unit 16 measures the distance between an arbitrary feature point and an adjacent feature point in the first tomographic image at diastole and in the first tomographic image at systole, and calculates the rate of change of that distance.
  • the change rate is reflected in the three-dimensional image of the heart.
  • The control unit 16 reflects the change rate in the three-dimensional image of the heart so that regions where the change rate is equal to or less than a predetermined threshold and regions where the change rate exceeds the threshold are shown in different manners (for example, in different colors).
  • the control unit 16 estimates that the part of the heart corresponding to the region where the rate of change is equal to or less than a predetermined threshold is the low motion part P.
  • the predetermined threshold value of the change rate is, for example, 12%, but may be changed as appropriate depending on the setting.
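  • A minimal sketch of this change-rate computation, assuming matched feature-point coordinates at diastole and systole and the 12 % threshold mentioned above (names and array layout are illustrative only):

```python
import numpy as np

def low_motion_mask(pts_diastole, pts_systole, neighbor_idx, threshold=0.12):
    """Flag feature points whose distance to an adjacent feature point changes
    by `threshold` (e.g. 12 %) or less between diastole and systole.

    pts_diastole, pts_systole : (N, 2) feature-point coordinates in the first
        tomographic images at diastole and at systole
    neighbor_idx : (N,) index of an adjacent feature point for each point
    """
    d_dia = np.linalg.norm(pts_diastole - pts_diastole[neighbor_idx], axis=1)
    d_sys = np.linalg.norm(pts_systole - pts_systole[neighbor_idx], axis=1)
    change_rate = np.abs(d_dia - d_sys) / d_dia
    return change_rate <= threshold, change_rate
```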
  • Next, the control unit 16 reads the second tomographic image input via the image input unit 11 and estimates the infarct site Q of the heart based on the second tomographic image (step S12: infarct site estimation step).
  • the infarct site Q is a site where the myocardium is ischemic and necrotic.
  • the infarct region Q is included in the low-motion region P because the change rate described above is equal to or less than a predetermined threshold value.
  • When the second tomographic image includes a delayed contrast image, the control unit 16 estimates the infarct site Q based on the delayed contrast image of the second tomographic image.
  • Specifically, the control unit 16 estimates that the part enhanced in the delayed contrast image is the infarct site Q.
  • When the second tomographic image is a radioisotope distribution image, the control unit 16 estimates the infarct site Q based on the radioisotope distribution.
  • Specifically, the control unit 16 estimates that an accumulation defect site where the radioisotope is not accumulated is the infarct site Q.
  • the control unit 16 may execute the infarct site estimation step (step S12) before the above-described low motion site estimation step (step S11).
  • The control unit 16 specifies, as the target part R, the portion of the low-motion site P estimated in the low-motion site estimation step (step S11) other than the infarct site Q estimated in the infarct site estimation step (step S12) (step S13: target part specifying step).
  • the target site R is a site where the above change rate is equal to or less than a predetermined threshold but is not necrotic, and is a hibernating myocardium or a stunned myocardium.
  • the control unit 16 generates display information in which the identified target region R is superimposed on the first tomographic image or the second tomographic image.
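  • The target-region selection described above amounts to a set difference of the two estimated regions; a minimal sketch on boolean voxel masks (an assumed representation, not mandated by the disclosure):

```python
import numpy as np

def target_site_mask(low_motion_P, infarct_Q):
    """Target region R: voxels that are low-motion but not infarcted."""
    return np.logical_and(low_motion_P, np.logical_not(infarct_Q))

# Usage with toy 3-D masks defined on a common voxel grid.
P = np.zeros((64, 64, 64), dtype=bool); P[20:40, 20:40, 20:40] = True
Q = np.zeros_like(P);                   Q[30:40, 30:40, 30:40] = True
R = target_site_mask(P, Q)   # candidate hibernating/stunned myocardium
```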
  • The target region R can include both hibernating myocardium and stunned myocardium, which exist independently of each other.
  • Hibernating myocardium is in a chronic ischemic state, whereas stunned myocardium is in an acute ischemic state caused by overload following the resumption of blood flow. Therefore, the region of stunned myocardium can be identified by inducing an overload condition and then eliminating it, which makes it possible to distinguish stunned myocardium from hibernating myocardium.
  • Since the heart repeatedly contracts and dilates as it beats, the expansion/contraction state of the heart is preferably the same or close in the first tomographic image used in the low-motion site estimation step (step S11) and the second tomographic image used in the infarct site estimation step (step S12). Therefore, the control unit 16 selects, from the plurality of first tomographic images, the first tomographic image corresponding to the expansion/contraction state of the heart in the second tomographic image, and specifies the target region R using the selected first tomographic image.
  • the expansion / contraction state of the heart in the first tomographic image may be estimated based on position information of the feature point by detecting a feature point from the first tomographic image by pattern recognition or the like.
  • the expansion / contraction state of the heart in the second tomographic image may be estimated based on position information of the feature point by detecting a feature point from the second tomographic image by pattern recognition or the like.
  • the feature points include, for example, the apex AP or the aortic valve AV.
  • the expansion / contraction state of the heart in the first tomographic image and the second tomographic image may be determined based on heart beat information input via the heartbeat input unit 12.
  • Specifically, the first tomographic image and the second tomographic image are each associated with the pulsation information of the heart at the time of imaging, and the expansion/contraction state of the heart in each image is determined from the associated pulsation information.
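  • A minimal sketch of selecting the first tomographic image whose cardiac phase matches that of the second tomographic image, assuming R-wave times from the beat information are available (the phase definition used here is an illustrative choice):

```python
import numpy as np

def cardiac_phase(t, r_wave_times):
    """Phase in [0, 1): elapsed fraction of the R-R interval containing time t.
    t is assumed to lie between the first and last recorded R waves."""
    r = np.asarray(r_wave_times, dtype=float)
    i = np.searchsorted(r, t, side="right") - 1
    return (t - r[i]) / (r[i + 1] - r[i])

def select_matching_frame(first_frame_times, second_image_time, r_wave_times):
    """Return the index of the first-tomographic-image frame whose cardiac phase
    is closest to the phase at which the second tomographic image was captured."""
    target = cardiac_phase(second_image_time, r_wave_times)
    phases = np.array([cardiac_phase(t, r_wave_times) for t in first_frame_times])
    diff = np.abs(phases - target)
    diff = np.minimum(diff, 1.0 - diff)      # phase is cyclic
    return int(diff.argmin())
```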
  • Since the image processing apparatus 10 can specify hibernating myocardium or stunned myocardium, for which the therapeutic effect is relatively high, as the target region R, it can contribute to improving the therapeutic effect.
  • FIG. 5 is a schematic diagram showing an example of the penetration region S estimated by the penetration region estimation process performed by the image processing apparatus 10.
  • FIG. 5 is a view showing a cross section of the heart wall of the left ventricle LV of the heart, and shows the range of the permeation region S located in the abnormal region R ′.
  • When an administration target is injected at an arbitrary injection point T of the abnormal site R′, the control unit 16 estimates the permeation region S into which the administration target permeates (permeation region estimation step).
  • the control unit 16 generates display information in which the estimated penetration region S is superimposed on the three-dimensional image.
  • the abnormal part R ′ of the heart is, for example, the target part R specified by the above-described target part specifying process.
  • The administration target is, for example, a biological material such as cells or a substance such as a biomaterial.
  • The permeation region S is the region that the administration target has reached a predetermined time after it is injected, within the period in which the effect of the administration target is obtained.
  • the control unit 16 estimates the position of the blood vessel BV in the heart based on the three-dimensional image, and estimates the penetration region S based on the position of the injection point T with respect to the position of the blood vessel BV. It is considered that the administration substance injected into the abnormal site R ′ is likely to penetrate in the direction of the blood vessel BV near the blood vessel BV due to the influence of blood flow. Therefore, as illustrated in FIG. 5, the control unit 16 estimates that the infiltration region S extends in the direction of the blood vessel BV as the injection point T is closer to the blood vessel BV.
  • the control unit 16 estimates the position of the infarct site Q based on the three-dimensional image, and estimates the penetration region S based on the position of the injection point T with respect to the position of the infarct site Q. It is considered that the administration substance injected into the abnormal site R ′ is less likely to penetrate in the direction of the infarct site Q because the heart activity such as blood flow or pulsation is reduced near the infarct site Q. Therefore, as shown in FIG. 5, the control unit 16 estimates that the infiltration region S is prevented from extending in the direction of the infarct site Q as the injection point T is closer to the infarct site Q.
  • the control unit 16 may estimate the permeation region S based on the administration dose and physical property information stored in the storage unit 15. Specifically, the control unit 16 estimates that the permeation region S increases as the dose of the administration target increases.
  • the control unit 16 may estimate the wall thickness for each part of the heart based on the three-dimensional image, and may estimate the penetration region S based on the wall thickness. Specifically, the control unit 16 estimates that the penetration region S becomes wider along the heart wall as the wall thickness near the injection point T is thinner.
  • the control unit 16 may estimate the permeation region S based on temporal changes of a plurality of three-dimensional images stored in the storage unit 15.
  • Specifically, the control unit 16 detects temporal changes in the positions of feature points in the plurality of three-dimensional images, estimates the motion due to pulsation and the like of each part of the heart wall based on those temporal changes, and estimates the permeation region S in accordance with the estimated motion of the part near the injection point T.
  • the control unit 16 may estimate the infiltration region S based on the shape information of the injection member stored in the storage unit 15.
  • the injection member is composed of, for example, a needle-like member, and a side hole for discharging the administration target is formed around the injection member.
  • the shape information of the injection member includes, for example, the outer shape (linear shape, curved shape, spiral shape, etc.), diameter size, side hole position, side hole size, etc. of the injection member.
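  • The disclosure describes these factors qualitatively and gives no formulas; the following sketch combines several of them (dose, blood-vessel proximity, infarct proximity) into a single voxel-mask heuristic purely for illustration, with all constants and the scoring form being assumptions:

```python
import numpy as np

def estimate_permeation_region(grid_xyz, wall_mask, injection_pt, dose_ul,
                               vessel_pt=None, infarct_mask=None,
                               radius_mm_per_ul=0.1):
    """Toy heuristic: permeation region S = heart-wall voxels within a
    dose-dependent radius of the injection point T, stretched toward a nearby
    blood vessel BV and suppressed toward the infarct site Q.

    grid_xyz : (X, Y, Z, 3) voxel-center coordinates [mm]
    wall_mask, infarct_mask : boolean voxel masks on the same grid
    injection_pt, vessel_pt : (3,) positions [mm]
    """
    r = radius_mm_per_ul * dose_ul                  # larger dose -> larger region
    d_inj = np.linalg.norm(grid_xyz - injection_pt, axis=-1)
    score = r - d_inj                               # positive inside the sphere
    if vessel_pt is not None:
        # Closer to the vessel -> spread is biased in that direction.
        d_vessel = np.linalg.norm(grid_xyz - vessel_pt, axis=-1)
        score += 0.3 * np.clip(r - d_vessel, 0.0, None)
    if infarct_mask is not None:
        score[infarct_mask] -= r                    # spread blocked into the infarct
    return (score > 0.0) & wall_mask
```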
  • As described above, the image processing apparatus 10 can estimate in advance the permeation region S into which an administration target injected at an arbitrary injection point T of the abnormal site R′ permeates, so the treatment can be simulated before it is performed.
  • FIG. 6 is a flowchart showing details of target injection point determination processing performed by the image processing apparatus 10.
  • FIG. 7 is a schematic diagram illustrating an example of the target injection point U determined by the target injection point determination process performed by the image processing apparatus 10.
  • FIG. 7 is a cross-sectional view of the left ventricle LV of the heart as viewed from the aortic valve AV (see FIG. 4) to the apex AP (see FIG. 4).
  • the control unit 16 reads out the three-dimensional image stored in the storage unit 15 and displays it on the display unit 14 (step S31: three-dimensional image display step).
  • the control unit 16 determines the positions of a plurality of target injection points U where the administration target should be injected into the abnormal site R ′ (step S32: target injection point determination step).
  • the control unit 16 causes the display unit 14 to display the determined plurality of target injection points U superimposed on the three-dimensional image (step S33: target injection point display step).
  • The position of the target injection point U includes information on the depth along the wall thickness direction from the inner surface of the heart wall. In other words, the target injection point U indicates at what position on the inner surface of the heart wall, and to what depth, the administration target should be injected.
  • the position of the target injection point U is determined based on, for example, the penetration region S estimated by the above-described penetration region estimation process.
  • Specifically, the control unit 16 estimates the permeation region S for each of a plurality of injection points T and, based on the estimated permeation regions S, determines which injection points T the administration target should be injected into as the target injection points U. For example, the control unit 16 identifies any injection point T whose permeation region S is contained within the other permeation regions S, and determines the injection points T other than the identified ones as the target injection points U. Accordingly, by injecting the administration target at the target injection points U, the permeation regions S of the injected administration target fill the abnormal site R′ more efficiently.
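  • A minimal sketch of this selection rule on boolean voxel masks (a simplified one-pass check; the representation and names are assumptions):

```python
import numpy as np

def select_target_points(candidate_points, permeation_masks):
    """Keep the injection points T whose permeation region S is NOT already
    contained in the union of the other candidates' permeation regions
    (a simplified one-pass version of the rule described above)."""
    keep = []
    for i, (pt, mask) in enumerate(zip(candidate_points, permeation_masks)):
        union_of_others = np.zeros_like(mask)
        for j, other in enumerate(permeation_masks):
            if j != i:
                union_of_others |= other
        if not np.all(union_of_others[mask]):   # S has voxels not covered by others
            keep.append(pt)                      # keep as a target injection point U
    return keep
```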
  • the control unit 16 determines the order of the plurality of target injection points U.
  • The control unit 16 causes the display unit 14 to display the plurality of target injection points U in a manner based on the determined order. For example, as illustrated in FIG. 7, the control unit 16 displays the determined order alongside each target injection point U. Alternatively, the control unit 16 may display only the target injection point U that is next in the order.
  • The control unit 16 estimates the movement path V along which the distal end of the injection member that injects the administration target moves via the plurality of target injection points U, and determines the order of the target injection points U based on the movement path V. For example, the control unit 16 determines the order of the target injection points U so that the movement path V is the shortest.
  • For example, the control unit 16 determines the order by successively selecting the target injection point U closest to the current one.
  • The control unit 16 may superimpose the estimated movement path V on the three-dimensional image and display it on the display unit 14. Thereby, an operator such as a medical worker can grasp the movement path V along which the distal end of the injection member is to be moved.
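  • A minimal sketch of ordering the target injection points by repeatedly moving to the nearest remaining point, one simple way to keep the movement path short (a heuristic, not a guaranteed shortest path):

```python
import numpy as np

def order_by_nearest_neighbor(points, start_idx=0):
    """Order the target injection points U greedily: from the current point,
    always move to the closest unvisited point."""
    pts = np.asarray(points, dtype=float)
    unvisited = list(range(len(pts)))
    order = [unvisited.pop(start_idx)]
    while unvisited:
        last = pts[order[-1]]
        nxt = min(unvisited, key=lambda j: np.linalg.norm(pts[j] - last))
        unvisited.remove(nxt)
        order.append(nxt)
    return order

# Usage: indices of U in the order the tip of the injection member should visit.
targets = [(10.0, 0.0, 5.0), (12.0, 1.0, 4.0), (30.0, 2.0, 8.0)]
visit_order = order_by_nearest_neighbor(targets)
```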
  • The control unit 16 may determine the order of the target injection points U so that the movement path V draws a spiral around the long axis O directed from the aortic valve AV (see FIG. 4) toward the apex AP (see FIG. 4) in the left ventricle LV of the heart.
  • In this case, the movement path V advances in the left ventricle LV along the circumferential direction M from the near, aortic-valve side toward the far, apex side, which makes the operation of the injection member easier.
  • Alternatively, the control unit 16 may determine the order so that the movement path V passes through the target injection points U while reciprocating along the long axis O from the aortic valve AV toward the apex AP in the left ventricle LV of the heart.
  • This can reduce the possibility that movement of the injection member is hindered by the papillary muscles located along the long axis O in the left ventricle LV, and can reduce trapping on the accompanying chordae tendineae.
  • FIG. 8 is a diagram showing a state of treatment by the injection member.
  • FIG. 8 shows a state in which the catheter 50 extends from the femoral artery FA through the aorta AO to the aortic valve AV, which is the entrance to the lumen of the left ventricle LV of the heart.
  • the infusion member is delivered through the catheter 50 to the left ventricle LV.
  • the catheter 50 is not limited to the femoral artery FA, and may extend from the radial artery of the wrist to the aortic valve AV, for example.
  • the ultrasonic image generating device 20 is located on the body surface of the subject, takes a first tomographic image at any time, and transmits it to the image processing device 10.
  • the ultrasonic image generation device 20 acquires the position information of the distal end portion of the injection member as needed and transmits it to the image processing device 10.
  • Thereby, the control unit 16 of the image processing apparatus 10 can cause the display unit 14 to display a three-dimensional image that follows the position of the distal end portion of the injection member.
  • The ultrasonic image generation device 20 may capture images not only from the body surface but also from the esophagus, a blood vessel, or the heart lumen (atrium or ventricle). However, it is preferable that the ultrasonic image generation device 20 captures images from the body surface in that the treatment can be performed non-invasively.
  • The control unit 16 may cause the display unit 14 to display, among the plurality of target injection points U, those at which the injection of the administration target by the injection member has been completed in a manner different from the untreated target injection points U.
  • the control unit 16 determines that the target injection point U has been treated based on, for example, the input of a signal indicating that the target injection point U has been treated via the operation input unit 13.
  • the control unit 16 may determine the treated target injection point U based on the newly input first tomographic image.
  • As described above, the image processing apparatus 10 can determine the positions of the plurality of target injection points U at which the administration target should be injected into the abnormal site R′, so a more specific simulation of the treatment can be performed before the treatment. In addition, since the image processing apparatus 10 displays the target injection points U in a manner based on the order in which treatment should be performed, it can guide the operator through the treatment in the predetermined order.
  • the present disclosure relates to an image processing apparatus, an image processing system, and an image processing method.
  • Reference signs: 1: Image processing system, 10: Image processing apparatus, 11: Image input unit, 12: Heartbeat input unit, 13: Operation input unit, 14: Display unit, 15: Storage unit, 16: Control unit, 20: Ultrasonic image generation device (first imaging device), 21: Ultrasonic transmission unit, 22: Ultrasonic reception unit, 23: Image forming unit, 30: Radiation image generation device (second imaging device), 31: Radiation emitting unit, 32: Radiation detecting unit, 33: Image forming unit, 40: Heartbeat acquisition device, 50: Catheter, AO: Aorta, AP: Apex, AV: Aortic valve, BV: Blood vessel, FA: Femoral artery, LV: Left ventricle, M: Circumferential direction, O: Long axis, P: Low-motion site, Q: Infarct site, R: Target site, R': Abnormal site, S: Permeation region, T: Injection point, U: Target injection point, V: Movement path

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Robotics (AREA)
  • Optics & Photonics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

This image processing device comprises: a storage unit that stores a three-dimensional image of the heart, the image including an abnormal site of the heart; and a control unit that estimates a permeation region into which an administered substance will permeate when the administered substance is injected into a given injection point in the abnormal site.

Description

Image processing apparatus, image processing system, and image processing method
 The present disclosure relates to an image processing apparatus, an image processing system, and an image processing method.
 Currently, in the treatment of heart failure and the like, treatments in which a biological material such as cells or an administration target such as a biomaterial is injected into tissue in expectation of a therapeutic effect are being studied. In such procedures, instruments such as catheters are used for tissue injection. In cell therapy using such a catheter, the position of the infarct is identified by 3D mapping or the like of living tissue such as a ventricle of the heart before the injection procedure. The cells or the like serving as the administration target are then injected toward a desired location according to the treatment, such as the boundary between the infarct and normal myocardial tissue. For example, Patent Document 1 describes estimating a portion of the heart with decreased wall motion as an abnormal portion from an ultrasonic image or the like and creating an image for diagnosis.
JP 2009-106530 A
 However, with conventional techniques it has not been possible to simulate a treatment before performing it.
 In view of the above problem, an object of the present disclosure is to provide an image processing apparatus, an image processing system, and an image processing method capable of simulating a treatment before performing it.
 An image processing apparatus as a first aspect of the present invention includes a storage unit that stores a three-dimensional image of the heart including an abnormal site of the heart, and a control unit that estimates a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
 In an image processing apparatus as one embodiment of the present invention, the storage unit stores physical property information of the administration target, and the control unit estimates the permeation region based on the physical property information of the administration target.
 In an image processing apparatus as one embodiment of the present invention, the storage unit stores the dose of the administration target, and the control unit estimates the permeation region based on the dose of the administration target.
 In an image processing apparatus as one embodiment of the present invention, the control unit estimates the wall thickness of the heart based on the three-dimensional image and estimates the permeation region based on the wall thickness.
 In an image processing apparatus as one embodiment of the present invention, the control unit estimates the position of a blood vessel in the heart based on the three-dimensional image and estimates the permeation region based on the position of the injection point relative to the position of the blood vessel.
 In an image processing apparatus as one embodiment of the present invention, the storage unit stores a plurality of the three-dimensional images captured at predetermined time intervals, and the control unit estimates the permeation region based on temporal changes of the plurality of three-dimensional images.
 In an image processing apparatus as one embodiment of the present invention, the storage unit stores shape information of an injection member that injects the administration target, and the control unit estimates the permeation region based on the shape information of the injection member.
 An image processing system as a second aspect of the present invention includes a first imaging device that captures a first tomographic image of the heart from outside the body, a second imaging device that captures a second tomographic image of the heart from outside the body, and an image processing apparatus. The image processing apparatus includes an image input unit that receives input of the first tomographic image and the second tomographic image, and a control unit that generates a three-dimensional image of the heart including an abnormal site of the heart based on the first tomographic image and the second tomographic image and estimates a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
 An image processing method as a third aspect of the present invention is an image processing method executed using an image processing apparatus, and includes a three-dimensional image storage step of storing a three-dimensional image of the heart including an abnormal site of the heart, and a permeation region estimation step of estimating a permeation region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
 An image processing apparatus as a fourth aspect of the present invention includes a storage unit that stores a three-dimensional image of the heart including an abnormal site of the heart, a display unit capable of displaying the three-dimensional image, and a control unit that determines, based on the three-dimensional image, positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site and causes the display unit to display the target injection points superimposed on the three-dimensional image.
 In an image processing apparatus as one embodiment of the present invention, the control unit determines the order of the plurality of target injection points and causes the display unit to display the plurality of target injection points in a manner based on the order.
 In an image processing apparatus as one embodiment of the present invention, the control unit estimates a movement path along which the distal end portion of an injection member that injects the administration target moves via the plurality of target injection points, and determines the order based on the movement path.
 In an image processing apparatus as one embodiment of the present invention, the control unit estimates a permeation region into which the administration target permeates when the administration target is injected at a predetermined injection point of the abnormal site, and determines the positions of the plurality of target injection points based on the permeation region.
 An image processing system as a fifth aspect of the present invention includes a first imaging device that captures a first tomographic image of the heart from outside the body, a second imaging device that captures a second tomographic image of the heart from outside the body, and an image processing apparatus. The image processing apparatus includes an image input unit that receives input of the first tomographic image and the second tomographic image, a display unit, and a control unit that generates a three-dimensional image of the heart including an abnormal site of the heart based on the first tomographic image and the second tomographic image, determines positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site based on the three-dimensional image, and causes the display unit to display the plurality of target injection points superimposed on the three-dimensional image.
 An image processing method as a sixth aspect of the present invention is an image processing method executed using an image processing apparatus, and includes a three-dimensional image storage step of storing a three-dimensional image of the heart including an abnormal site of the heart, a three-dimensional image display step of displaying the three-dimensional image, a target injection point determination step of determining positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site based on the three-dimensional image, and a target injection point display step of displaying the plurality of target injection points superimposed on the three-dimensional image.
 According to the image processing apparatus, the image processing system, and the image processing method of the present disclosure, a treatment can be simulated before it is performed.
FIG. 1 is a block diagram showing a schematic configuration of an image processing system including an image processing apparatus as an embodiment of the present invention. FIG. 2 is a flowchart showing an overview of the image processing performed by the image processing apparatus shown in FIG. 1. FIG. 3 is a flowchart showing details of the target part specifying process performed by the image processing apparatus shown in FIG. 1. FIG. 4 is a diagram explaining the image processing accompanying the target part specifying process performed by the image processing apparatus shown in FIG. 1. FIG. 5 is a schematic diagram showing an example of the permeation region estimated by the permeation region estimation process performed by the image processing apparatus shown in FIG. 1. FIG. 6 is a flowchart showing details of the target injection point determination process performed by the image processing apparatus shown in FIG. 1. FIG. 7 is a schematic diagram showing an example of the target injection points determined by the target injection point determination process performed by the image processing apparatus shown in FIG. 1. FIG. 8 is a diagram showing a state of treatment by the injection member.
 Hereinafter, an embodiment of the present invention will be described with reference to the drawings. In the drawings, common components are denoted by the same reference signs.
[Image processing system 1]
 FIG. 1 is a block diagram showing a schematic configuration of an image processing system 1 including an image processing apparatus 10 as an embodiment of the present invention. As illustrated in FIG. 1, the image processing system 1 includes an image processing apparatus 10, an ultrasonic image generation device 20 as a first imaging device, a radiation image generation device 30 as a second imaging device, and a heartbeat acquisition device 40.
 第1撮像装置としての超音波画像生成装置20は、被検者の体外に位置し、被検者の体外から心臓の第1断層画像としての超音波画像を撮像する。超音波画像生成装置20は、超音波を発信する超音波発信部21と、超音波を受信する超音波受信部22と、超音波受信部22が受信した超音波に基づいて第1断層画像を形成する画像形成部23と、を備える。超音波画像生成装置20は、超音波発信部21及び超音波受信部22を被検体の体表面に接触させた状態で、超音波発信部21から被検体の心臓に向けて超音波を発信し、被検体の心臓から反射した超音波を超音波受信部22で受信する。超音波画像生成装置20は、超音波受信部22で受信した超音波を画像形成部23で処理することで、超音波の進行面に沿った断層画像を第1断層画像として得る。超音波画像生成装置20は、撮像した第1断層画像を、画像処理装置10の画像入力部11に出力する。 The ultrasound image generation device 20 as the first imaging device is located outside the body of the subject and captures an ultrasound image as a first tomographic image of the heart from outside the subject. The ultrasonic image generation device 20 generates an ultrasonic transmission unit 21 that transmits ultrasonic waves, an ultrasonic reception unit 22 that receives ultrasonic waves, and a first tomographic image based on the ultrasonic waves received by the ultrasonic reception unit 22. And an image forming unit 23 to be formed. The ultrasonic image generation apparatus 20 transmits ultrasonic waves from the ultrasonic transmission unit 21 toward the subject's heart in a state where the ultrasonic transmission unit 21 and the ultrasonic reception unit 22 are in contact with the body surface of the subject. The ultrasonic wave reception unit 22 receives the ultrasonic wave reflected from the heart of the subject. The ultrasonic image generating apparatus 20 obtains a tomographic image along the ultrasonic traveling surface as a first tomographic image by processing the ultrasonic wave received by the ultrasonic wave receiving unit 22 by the image forming unit 23. The ultrasonic image generation apparatus 20 outputs the captured first tomographic image to the image input unit 11 of the image processing apparatus 10.
 超音波画像生成装置20は、超音波発信部21及び超音波受信部22の位置又は向きを変更させて異なる面に沿って撮像した複数の断層画像に基づいて、3次元画像を第1断層画像として生成してもよい。すなわち、第1断層画像は、1つの面に沿って撮像された断層画像でもよいし、複数の面に沿って撮像された複数の断層画像に基づいて生成された3次元画像でもよい。 The ultrasonic image generation apparatus 20 changes the position or orientation of the ultrasonic transmission unit 21 and the ultrasonic reception unit 22 and changes the position or orientation of the ultrasonic image generation unit 21 and the ultrasonic reception unit 22 into a first tomographic image based on a plurality of tomographic images captured along different planes. May be generated as That is, the first tomographic image may be a tomographic image captured along one plane, or may be a three-dimensional image generated based on a plurality of tomographic images captured along a plurality of planes.
 The radiation image generation device 30 as the second imaging device is located outside the subject's body and captures, from outside the body, a radiation image of the heart as a second tomographic image. The radiation image generation device 30 is, for example, a computed tomography (CT) apparatus. The radiation image generation device 30 includes a radiation emitting unit 31 that emits radiation, a radiation detection unit 32 that detects radiation, and an image forming unit 33 that forms the second tomographic image based on the radiation detected by the radiation detection unit 32. The radiation emitting unit 31 and the radiation detection unit 32 are arranged at positions facing each other across the subject; while rotating them around the subject, the device emits radiation such as X-rays from the radiation emitting unit 31 toward the subject's heart and detects the radiation that has passed through the heart with the radiation detection unit 32. By processing the detected radiation in the image forming unit 33, the radiation image generation device 30 obtains a radiation image, which is a three-dimensional image of the heart, as the second tomographic image. The radiation image generation device 30 outputs the captured second tomographic image to the image input unit 11 of the image processing apparatus 10.
 The second imaging device may be a magnetic resonance imaging (MRI) apparatus instead of the radiation image generation device 30. The magnetic resonance imaging apparatus is located outside the subject's body and captures, from outside the body, a magnetic resonance image of the heart as the second tomographic image. The magnetic resonance imaging apparatus includes a magnetic field generation unit that generates a magnetic field, a signal reception unit that receives nuclear magnetic resonance signals, and an image forming unit that forms, as the second tomographic image, a magnetic resonance image that is a three-dimensional image based on the received nuclear magnetic resonance signals.
 A contrast agent is administered to the subject's heart a predetermined time before the second tomographic image is captured by the radiation image generation device 30 or the magnetic resonance imaging apparatus serving as the second imaging device. The second tomographic image captured by the second imaging device therefore includes a delayed-enhancement image.
 The second imaging device may instead be a nuclear medicine examination apparatus that performs scintigraphy, SPECT (Single Photon Emission Computed Tomography), or PET (Positron Emission Tomography) examinations, in place of the radiation image generation device 30 or the magnetic resonance imaging apparatus. The nuclear medicine examination apparatus is located outside the subject's body and acquires, from outside the body, a radioisotope (RI) distribution image of the heart as the second tomographic image. It obtains the second tomographic image by imaging the distribution of a radioisotope-labeled agent administered to the subject in advance.
 The heartbeat acquisition device 40 acquires pulsation information of the subject's heart. The pulsation information includes information on the temporal change of the heartbeat. The pulsation information may be acquired simultaneously with the capture of the first or second tomographic image and associated with that image. The heartbeat acquisition device 40 is, for example, an electrocardiogram monitor that measures temporal changes in cardiac action potential via electrodes attached to the subject's chest or limbs and continuously displays an electrocardiogram waveform.
 The image processing apparatus 10 is located outside the subject's body and is configured by an information processing device such as a computer. The image processing apparatus 10 includes an image input unit 11, a heartbeat input unit 12, an operation input unit 13, a display unit 14, a storage unit 15, and a control unit 16.
 The image input unit 11 receives input of the first image from the ultrasonic image generation device 20 as the first imaging device, and input of the second image from the radiation image generation device 30 as the second imaging device. The image input unit 11 includes an interface that receives information from the ultrasonic image generation device 20 and the radiation image generation device 30 by, for example, wired or wireless communication. The image input unit 11 outputs the information of the input images to the control unit 16.
 The heartbeat input unit 12 receives input of the cardiac pulsation information from the heartbeat acquisition device 40. The heartbeat input unit 12 includes an interface that receives information from the heartbeat acquisition device 40 by, for example, wired or wireless communication. The heartbeat input unit 12 outputs the input pulsation information to the control unit 16.
 The operation input unit 13 includes, for example, a keyboard, a mouse, or a touch panel. When the operation input unit 13 includes a touch panel, the touch panel may be provided integrally with the display unit 14. The operation input unit 13 outputs the input information to the control unit 16.
 Based on signals from the control unit 16, the display unit 14 displays the first tomographic image, the second tomographic image, and images that the control unit 16 generates from them. The display unit 14 includes a display device such as a liquid crystal display or an organic EL display.
 The storage unit 15 stores various information and programs for causing the control unit 16 to execute specific functions. The storage unit 15 stores, for example, a three-dimensional image of the heart. The three-dimensional image of the heart is the first tomographic image, the second tomographic image, or display information that the control unit 16 generates from them in the target site identification process described later. The three-dimensional image of the heart includes an abnormal site R' of the heart (see FIGS. 5 and 7). The abnormal site R' is, for example, the target site R (see FIG. 4(c)) identified by the control unit 16 in the target site identification process described later. The storage unit 15 stores, for example, a plurality of three-dimensional images based on tomographic images captured at different times. The storage unit 15 also stores, for example, the dose and physical property information of the administration target to be injected into the abnormal site R' in treatment using an injection member described later, and the shape information of the injection member. The storage unit 15 includes a storage device such as a RAM or a ROM.
 The control unit 16 controls the operation of each component of the image processing apparatus 10. The control unit 16 executes a specific function by reading a specific program. Specifically, the control unit 16 generates display information based on the first tomographic image and the second tomographic image, and causes the display unit 14 to display the generated display information. The control unit 16 may output the generated display information to an external display device. The control unit 16 includes, for example, a processor.
 When the second tomographic image is captured by the radiation image generation device 30 or the magnetic resonance imaging apparatus, the control unit 16 may generate the display information by correcting the first tomographic image based on the second tomographic image. For example, the control unit 16 detects feature points in the first tomographic image and in the second tomographic image by pattern recognition or the like, and replaces a region including a feature point in the first tomographic image with the region of the second tomographic image that includes the corresponding feature point, thereby generating display information in which the first tomographic image is corrected based on the second tomographic image. Because the first tomographic image can thus be corrected with the higher-definition second tomographic image, the structure and shape of the heart can be represented more accurately.
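A minimal sketch of this patch-replacement correction is given below, assuming both images have already been resampled onto a common 2-D grid and that the matched feature-point coordinates are supplied externally; the function name, the patch size, and the matching step are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def correct_first_with_second(first_img, second_img, matched_points, half=8):
    """Replace square patches of the first (ultrasound) image around matched
    feature points with the corresponding patches of the higher-definition
    second (CT/MRI) image. Both images are 2-D arrays on the same grid;
    matched_points is a list of (row, col) feature-point positions."""
    corrected = first_img.copy()
    rows, cols = first_img.shape
    for r, c in matched_points:
        r0, r1 = max(r - half, 0), min(r + half, rows)
        c0, c1 = max(c - half, 0), min(c + half, cols)
        corrected[r0:r1, c0:c1] = second_img[r0:r1, c0:c1]
    return corrected
```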
[Image processing overview]
 FIG. 2 is a flowchart showing an overview of the image processing performed by the image processing apparatus 10. As shown in FIG. 2, the image processing apparatus 10 first performs the target site identification process (step S10), then performs the penetration region estimation process (step S20), and finally performs the target injection point determination process (step S30).
[Target site identification process]
 FIG. 3 is a flowchart showing details of the target site identification process performed by the image processing apparatus 10. FIG. 4 is a diagram explaining the image processing accompanying the target site identification process and shows a cross section of the left ventricle LV of the heart. As shown in FIG. 4(a), the control unit 16 of the image processing apparatus 10 reads the first tomographic image input via the image input unit 11 and estimates a low-motion site P of the heart based on the first tomographic image (step S11: low-motion site estimation step). Specifically, the image input unit 11 receives a plurality of first tomographic images captured at predetermined time intervals, and the control unit 16 estimates the low-motion site P based on the temporal change of these images. In more detail, the control unit 16 first extracts, as feature points, a plurality of points in the first tomographic image whose luminance is equal to or higher than a predetermined value. The control unit 16 extracts such feature points from first tomographic images captured at different times, including end-diastole, when the myocardium is most dilated, and end-systole, when it is most contracted. The control unit 16 then calculates the rate of change of the distance between an arbitrary feature point and an adjacent feature point, measured in the diastolic and systolic first tomographic images, and reflects the calculated rate of change in the three-dimensional image of the heart. For example, the control unit 16 renders regions whose rate of change is equal to or below a predetermined threshold and regions whose rate of change exceeds the threshold in different manners (for example, different colors) in the three-dimensional image. The control unit 16 estimates that the part of the heart corresponding to a region whose rate of change is equal to or below the threshold is the low-motion site P. The predetermined threshold of the rate of change is, for example, 12%, and may be changed as appropriate by setting.
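The change-rate computation can be sketched as below, assuming the same feature points have already been tracked between the end-diastolic and end-systolic images and that adjacency pairs are given; the function name and the strain-like formula are illustrative assumptions, while the 12% default threshold comes from the text.

```python
import numpy as np

def low_motion_flags(diastole_pts, systole_pts, neighbors, threshold=0.12):
    """diastole_pts, systole_pts: (N, 3) coordinates of the same feature points
    at end-diastole and end-systole; neighbors: list of (i, j) index pairs of
    adjacent feature points. Returns one boolean per pair, True where the
    fractional change of the inter-point distance is at or below the threshold
    (candidate low-motion region)."""
    i_idx = [i for i, _ in neighbors]
    j_idx = [j for _, j in neighbors]
    d_dia = np.linalg.norm(diastole_pts[i_idx] - diastole_pts[j_idx], axis=1)
    d_sys = np.linalg.norm(systole_pts[i_idx] - systole_pts[j_idx], axis=1)
    rate = np.abs(d_dia - d_sys) / d_dia   # fractional distance change per pair
    return rate <= threshold
```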
 As shown in FIG. 4(b), the control unit 16 reads the second tomographic image input via the image input unit 11 and estimates an infarct site Q of the heart based on the second tomographic image (step S12: infarct site estimation step). The infarct site Q is a site where the myocardium has become ischemic and necrotic. The infarct site Q has a rate of change equal to or below the predetermined threshold described above and is included in the low-motion site P. Specifically, when the second tomographic image includes a delayed-enhancement image, the control unit 16 estimates the infarct site Q based on that delayed-enhancement image; in detail, it estimates that the region in which delayed enhancement appears is the infarct site Q. When the second tomographic image is a radioisotope distribution image, the control unit 16 estimates the infarct site Q based on the radioisotope distribution; in detail, it estimates that an accumulation-defect region in which the radioisotope has not accumulated is the infarct site Q. The control unit 16 may execute the infarct site estimation step (step S12) before the low-motion site estimation step (step S11) described above.
 As shown in FIG. 4(c), the control unit 16 identifies, as the target site R, the portion of the low-motion site P estimated in the low-motion site estimation step (step S11) other than the infarct site Q estimated in the infarct site estimation step (step S12) (step S13: target site identification step). The target site R is a site whose rate of change is equal to or below the predetermined threshold but which is not necrotic, namely hibernating myocardium or stunned myocardium. The control unit 16 generates display information in which the identified target site R is superimposed on the first tomographic image or the second tomographic image. The target site R includes hibernating myocardium and stunned myocardium, which exist independently of each other. Hibernating myocardium is in a chronic ischemic state, whereas stunned myocardium is in an acute ischemic state. Stunned myocardium is caused by overload due to the resumption of blood flow. Therefore, by inducing an overload state and then relieving it, the region of stunned myocardium can be identified, which allows stunned myocardium and hibernating myocardium to be distinguished.
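On a common voxel grid, the target site R is simply the low-motion region with the infarct region removed. The sketch below assumes both regions are boolean volumes and that the infarct mask is derived by thresholding a delayed-enhancement volume; the threshold value and function names are assumptions for illustration only.

```python
import numpy as np

def infarct_mask_from_delayed_enhancement(enhancement_volume, threshold):
    # Voxels whose delayed-enhancement intensity exceeds the (assumed) threshold
    # are treated as infarcted tissue Q.
    return enhancement_volume > threshold

def target_site_mask(low_motion_mask, infarct_mask):
    # Target site R = low-motion region P minus infarct region Q.
    return low_motion_mask & ~infarct_mask
```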
 Because the heart repeatedly contracts and expands as it beats, the contraction state of the heart in the first tomographic image used in the low-motion site estimation step (step S11) and in the second tomographic image used in the infarct site estimation step (step S12) are preferably the same or close. The control unit 16 therefore selects, from the plurality of first tomographic images, a first tomographic image corresponding to the contraction state of the heart in the second tomographic image, and identifies the target site R using the selected first tomographic image. The contraction state of the heart in the first tomographic image may be estimated by detecting feature points in the first tomographic image by pattern recognition or the like and using the position information of those feature points; the contraction state in the second tomographic image may be estimated in the same way. The feature points include, for example, the cardiac apex AP or the aortic valve AV. The contraction state of the heart in the first and second tomographic images may also be determined based on the cardiac pulsation information input via the heartbeat input unit 12; in detail, each of the first and second tomographic images is associated with the pulsation information at the time it was captured, and the contraction state of the heart in each image is determined from the associated pulsation information.
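One way to realize this phase matching, assuming each image has been tagged with a cardiac phase expressed as a fraction of the R-R interval taken from the associated ECG, is sketched below; the phase representation and function name are assumptions, not part of the original disclosure.

```python
import numpy as np

def select_phase_matched_image(first_images, first_phases, second_phase):
    """first_phases: cardiac phase of each first tomographic image in [0, 1)
    (fraction of the R-R interval); second_phase: phase of the second image.
    Returns the first image whose phase is closest, treating phase as cyclic."""
    diff = np.abs(np.asarray(first_phases) - second_phase)
    diff = np.minimum(diff, 1.0 - diff)   # wrap around the cardiac cycle
    return first_images[int(np.argmin(diff))]
```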
 As described above, the image processing apparatus 10 can identify, as the target site R, hibernating or stunned myocardium, for which the therapeutic effect is relatively high, and can therefore contribute to improving the therapeutic effect.
[Penetration region estimation process]
 FIG. 5 is a schematic diagram showing an example of the penetration region S estimated by the penetration region estimation process performed by the image processing apparatus 10. FIG. 5 shows a cross section of the heart wall of the left ventricle LV and the extent of the penetration region S located within the abnormal site R'. Assuming that the administration target is injected at an arbitrary injection point T of the abnormal site R' included in the three-dimensional image of the heart stored in the storage unit 15, the control unit 16 estimates the penetration region S into which the administration target permeates (penetration region estimation step). The control unit 16 generates display information in which the estimated penetration region S is superimposed on the three-dimensional image. The abnormal site R' of the heart is, for example, the target site R identified in the target site identification process described above. The administration target is, for example, a biological material such as cells or a substance such as a biomaterial. The penetration region S is the region reached after a predetermined time has elapsed, within the time during which the effect of the administration target is obtained after injection.
 For example, the control unit 16 estimates the position of a blood vessel BV in the heart based on the three-dimensional image and estimates the penetration region S based on the position of the injection point T relative to the blood vessel BV. Near the blood vessel BV, the administration target injected into the abnormal site R' is considered likely to permeate toward the blood vessel BV under the influence of blood flow. Therefore, as shown in FIG. 5, the control unit 16 estimates that the closer the injection point T is to the blood vessel BV, the further the penetration region S extends toward the blood vessel BV. The control unit 16 also estimates, for example, the position of the infarct site Q based on the three-dimensional image and estimates the penetration region S based on the position of the injection point T relative to the infarct site Q. Near the infarct site Q, cardiac activity such as blood flow and pulsation is reduced, so the administration target is considered less likely to permeate toward the infarct site Q. Therefore, as shown in FIG. 5, the control unit 16 estimates that the closer the injection point T is to the infarct site Q, the more the extension of the penetration region S toward the infarct site Q is impeded.
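A highly simplified, direction-dependent model of this behavior is sketched below: a base radius is stretched in the direction of the nearest vessel point and shrunk in the direction of the nearest infarct point, with the stretch and shrink weakening as those structures get farther away. The gain and damping factors, the distance weighting, and all names are illustrative assumptions and not the patent's own model.

```python
import numpy as np

def estimate_penetration_region(grid_pts, injection_pt, vessel_pt, infarct_pt,
                                base_radius, vessel_gain=0.5, infarct_damping=0.5):
    """Return a boolean mask over grid_pts (M, 3) marking an estimated
    penetration region around injection_pt (3,)."""
    offsets = grid_pts - injection_pt
    dist = np.linalg.norm(offsets, axis=1) + 1e-9
    directions = offsets / dist[:, None]

    to_vessel = vessel_pt - injection_pt
    to_infarct = infarct_pt - injection_pt
    v_dir = to_vessel / (np.linalg.norm(to_vessel) + 1e-9)
    q_dir = to_infarct / (np.linalg.norm(to_infarct) + 1e-9)

    # Alignment in [0, 1]: 1 when a grid point lies straight toward the vessel/infarct.
    v_align = np.clip(directions @ v_dir, 0.0, 1.0)
    q_align = np.clip(directions @ q_dir, 0.0, 1.0)

    # A closer vessel stretches the region more; a closer infarct shrinks it more.
    v_weight = vessel_gain / (1.0 + np.linalg.norm(to_vessel))
    q_weight = infarct_damping / (1.0 + np.linalg.norm(to_infarct))

    radius = base_radius * (1.0 + v_weight * v_align) * (1.0 - q_weight * q_align)
    return dist <= radius
```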
 The control unit 16 may also estimate the penetration region S based on the dose and physical property information of the administration target stored in the storage unit 15; in detail, it estimates that the penetration region S becomes larger as the dose increases. The control unit 16 may estimate the wall thickness of each part of the heart based on the three-dimensional image and estimate the penetration region S based on that wall thickness; in detail, it estimates that the thinner the wall near the injection point T, the more the penetration region S spreads along the heart wall. The control unit 16 may estimate the penetration region S based on the temporal change of the plurality of three-dimensional images stored in the storage unit 15; in detail, it detects the temporal change of the positions of feature points in those images, estimates the motion of each part of the heart wall due to pulsation and the like from that change, and estimates that the penetration region S becomes larger in parts with larger motion. The control unit 16 may also estimate the penetration region S based on the shape information of the injection member stored in the storage unit 15. The injection member is, for example, a needle-like member with side holes formed around it for discharging the administration target. The shape information of the injection member includes, for example, the outer shape of the injection member (straight, curved, helical, etc.), its diameter, and the positions and sizes of the side holes.
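The dose, wall thickness, and wall motion factors could, for instance, scale the base radius used in the previous sketch. The cube-root dose scaling and the linear thickness and motion factors below are illustrative assumptions only; the patent does not specify numerical relationships.

```python
def base_penetration_radius(reference_radius, dose, reference_dose,
                            wall_thickness, reference_thickness,
                            wall_motion, reference_motion):
    """Scale an assumed reference radius by dose, local wall thickness and
    local wall motion: larger dose -> larger region (volume ~ r^3), thinner
    wall -> wider spread along the wall, larger motion -> larger region."""
    dose_factor = (dose / reference_dose) ** (1.0 / 3.0)
    thickness_factor = reference_thickness / max(wall_thickness, 1e-6)
    motion_factor = 1.0 + 0.1 * (wall_motion - reference_motion)
    return reference_radius * dose_factor * thickness_factor * motion_factor
```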
 As described above, the image processing apparatus 10 can estimate in advance the penetration region S into which the administration target injected at an arbitrary injection point T of the abnormal site R' permeates, so a treatment simulation can be performed before the treatment is carried out.
[Target injection point determination process]
 FIG. 6 is a flowchart showing details of the target injection point determination process performed by the image processing apparatus 10. FIG. 7 is a schematic diagram showing an example of the target injection points U determined by that process, viewing the left ventricle LV of the heart from the aortic valve AV (see FIG. 4) toward the cardiac apex AP (see FIG. 4). The control unit 16 reads the three-dimensional image stored in the storage unit 15 and displays it on the display unit 14 (step S31: three-dimensional image display step). Based on the three-dimensional image, the control unit 16 determines the positions of a plurality of target injection points U at which the administration target should be injected into the abnormal site R' (step S32: target injection point determination step). The control unit 16 causes the display unit 14 to display the determined target injection points U superimposed on the three-dimensional image (step S33: target injection point display step). The position of a target injection point U includes information on the depth along the wall thickness direction from the inner surface of the heart wall; in other words, the target injection point U indicates at what position on the inner surface of the heart wall and at what depth the administration target should be injected. The positions of the target injection points U are determined, for example, based on the penetration regions S estimated in the penetration region estimation process described above. In detail, the control unit 16 estimates the penetration region S for each of a plurality of injection points T and, based on the estimated penetration regions S, determines which injection points T should receive the administration target as the target injection points U. For example, the control unit 16 identifies injection points T whose penetration regions S are contained within the other penetration regions S, and determines the injection points T other than those identified as the target injection points U. By injecting the administration target at the target injection points U determined in this way, the penetration regions S of the injected administration target fill the abnormal site R' more efficiently.
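A minimal sketch of this selection is given below: candidates whose penetration region is already covered by the union of the other kept candidates are dropped. The greedy, order-dependent removal and all names are illustrative assumptions; the patent only states the containment criterion.

```python
import numpy as np

def select_target_injection_points(candidate_points, penetration_masks):
    """candidate_points: list of candidate injection point coordinates;
    penetration_masks: list of boolean volumes, one per candidate, from the
    penetration estimate. Returns the candidates kept as target injection points."""
    keep = list(range(len(candidate_points)))
    for i in list(keep):
        others = [k for k in keep if k != i]
        if not others:
            break
        union_of_others = np.any(np.stack([penetration_masks[k] for k in others]), axis=0)
        if np.all(penetration_masks[i] <= union_of_others):   # region fully covered
            keep.remove(i)
    return [candidate_points[k] for k in keep]
```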
 The control unit 16 determines an order for the plurality of target injection points U and causes the display unit 14 to display the target injection points U in a manner based on the determined order. For example, as shown in FIG. 7, the control unit 16 displays the determined order next to each target injection point U, or displays only the target injection point U that is next in order. The control unit 16 estimates a movement path V along which the distal end of the injection member that injects the administration target moves via the plurality of target injection points U, and determines the order of the target injection points U based on the movement path V. For example, the control unit 16 determines the order of the target injection points U so that the movement path V is the shortest; in detail, it orders the target injection points U so that the points closest to each other come in sequence. The control unit 16 may display the estimated movement path V superimposed on the three-dimensional image on the display unit 14. This allows an operator such as a medical professional to grasp the optimal way to move the injection member according to the order of the target injection points U.
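Ordering the points so that mutually closest points come in sequence corresponds to a greedy nearest-neighbor heuristic, sketched below; a truly shortest path would require solving a travelling-salesman-type problem, and the starting point choice here is an assumption.

```python
import numpy as np

def order_by_nearest_neighbor(points, start_index=0):
    """points: (N, 3) array of target injection points. Returns an index order
    in which each next point is the nearest remaining one."""
    remaining = list(range(len(points)))
    order = [remaining.pop(start_index)]
    while remaining:
        last = points[order[-1]]
        dists = [np.linalg.norm(points[i] - last) for i in remaining]
        order.append(remaining.pop(int(np.argmin(dists))))
    return order
```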
 As shown in FIG. 7(a), the control unit 16 may determine the order of the target injection points U so that the movement path V describes a spiral around the long axis O extending from the aortic valve AV (see FIG. 4) toward the cardiac apex AP (see FIG. 4) in the left ventricle LV of the heart. The movement path V then proceeds in the left ventricle LV along the circumferential direction M from the near aortic valve side toward the far apex side without doubling back partway, which makes the injection member easier to operate.
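One way to impose such a spiral order, assuming the aortic valve and apex positions are known, is to sort the points primarily by their progress along the valve-to-apex axis (in coarse bands) and secondarily by circumferential angle; the band count and the angular reference frame below are illustrative choices, not prescribed by the patent.

```python
import numpy as np

def spiral_order(points, valve_center, apex, n_bands=3):
    """points: (N, 3) target injection points. Returns indices ordered so the
    path advances from the valve toward the apex while circling the long axis."""
    axis = apex - valve_center
    axis = axis / np.linalg.norm(axis)
    rel = points - valve_center
    axial = rel @ axis                         # progress toward the apex
    radial = rel - np.outer(axial, axis)       # component around the axis
    # Build an arbitrary in-plane frame to measure the circumferential angle.
    ref = np.cross(axis, np.array([1.0, 0.0, 0.0]))
    if np.linalg.norm(ref) < 1e-6:
        ref = np.cross(axis, np.array([0.0, 1.0, 0.0]))
    ref = ref / np.linalg.norm(ref)
    ref2 = np.cross(axis, ref)
    angle = np.arctan2(radial @ ref2, radial @ ref)
    band = np.floor(axial / (axial.max() / n_bands + 1e-9)).astype(int)
    return np.lexsort((angle, band))           # sort by band, then by angle
```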
 As shown in FIG. 7(b), the control unit 16 may determine the order of the target injection points U so that the movement path V reciprocates along the long axis O from the aortic valve AV toward the cardiac apex AP in the left ventricle LV of the heart. Because the movement path V then runs along the long axis O, the risk that the movement of the injection member is hindered by the papillary muscles located along the long axis O in the left ventricle LV can be reduced, and snagging on the chordae tendineae attached to the mitral valve can be reduced.
 FIG. 8 is a diagram showing treatment with the injection member. FIG. 8 shows a state in which a catheter 50 extends from the femoral artery FA through the aorta AO to the aortic valve AV, which is the entrance to the left ventricle LV of the heart. The injection member is delivered through the catheter 50 to the left ventricle LV. The catheter 50 is not limited to the femoral artery FA and may extend to the aortic valve AV from, for example, the radial artery of the wrist.
 As shown in FIG. 8, the ultrasonic image generation device 20 is positioned on the body surface of the subject, captures first tomographic images as needed, and transmits them to the image processing apparatus 10. The ultrasonic image generation device 20 also acquires position information of the distal end of the injection member as needed and transmits it to the image processing apparatus 10. This allows the control unit 16 of the image processing apparatus 10 to display on the display unit 14, as display information, for example a three-dimensional image that follows the position of the distal end of the injection member. The ultrasonic image generation device 20 may image not only from the body surface but also from the esophagus, a blood vessel, or a heart chamber (atrium or ventricle); however, imaging from the body surface is preferable in that it allows a non-invasive procedure.
 The control unit 16 may cause the display unit 14 to display, among the plurality of target injection points U, those at which the injection of the administration target by the injection member has been completed in a manner different from the untreated target injection points U. The control unit 16 determines that a target injection point U has been treated based on, for example, the input via the operation input unit 13 of a signal indicating that treatment has been completed. The control unit 16 may also determine the treated target injection points U based on a newly input first tomographic image.
 As described above, the image processing apparatus 10 can determine the positions of the plurality of target injection points U at which the administration target should be injected into the abnormal site R', so a more specific treatment simulation can be performed before the treatment. Because the image processing apparatus 10 displays the target injection points U in a manner based on the order in which they should be treated, it can guide the operator through the treatment in the predetermined order.
 The present disclosure is not limited to the configurations specified in the embodiments described above, and various modifications are possible without departing from the scope of the claims. For example, the functions included in each component and each step can be rearranged so as not to be logically inconsistent, and a plurality of components or steps can be combined into one or divided.
 The present disclosure relates to an image processing apparatus, an image processing system, and an image processing method.
1: image processing system
10: image processing apparatus
11: image input unit
12: heartbeat input unit
13: operation input unit
14: display unit
15: storage unit
16: control unit
20: ultrasonic image generation device (first imaging device)
21: ultrasonic transmission unit
22: ultrasonic reception unit
23: image forming unit
30: radiation image generation device (second imaging device)
31: radiation emitting unit
32: radiation detection unit
33: image forming unit
40: heartbeat acquisition device
50: catheter
AO: aorta
AP: cardiac apex
AV: aortic valve
BV: blood vessel
FA: femoral artery
LV: left ventricle
M: circumferential direction
O: long axis
P: low-motion site
Q: infarct site
R: target site
R': abnormal site
S: penetration region
T: injection point
U: target injection point
V: movement path

Claims (15)

  1.  An image processing apparatus comprising:
     a storage unit that stores a three-dimensional image of a heart including an abnormal site of the heart; and
     a control unit that estimates a penetration region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  2.  The image processing apparatus according to claim 1, wherein the storage unit stores physical property information of the administration target, and the control unit estimates the penetration region based on the physical property information of the administration target.
  3.  The image processing apparatus according to claim 1 or 2, wherein the storage unit stores a dose of the administration target, and the control unit estimates the penetration region based on the dose of the administration target.
  4.  The image processing apparatus according to any one of claims 1 to 3, wherein the control unit estimates a wall thickness of the heart based on the three-dimensional image and estimates the penetration region based on the wall thickness.
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein the control unit estimates a position of a blood vessel in the heart based on the three-dimensional image and estimates the penetration region based on a position of the injection point relative to the position of the blood vessel.
  6.  The image processing apparatus according to any one of claims 1 to 5, wherein the storage unit stores a plurality of the three-dimensional images captured at predetermined time intervals, and the control unit estimates the penetration region based on a temporal change of the plurality of three-dimensional images.
  7.  The image processing apparatus according to any one of claims 1 to 6, wherein the storage unit stores shape information of an injection member that injects the administration target, and the control unit estimates the penetration region based on the shape information of the injection member.
  8.  An image processing system comprising:
     a first imaging device that captures a first tomographic image of a heart from outside a body;
     a second imaging device that captures a second tomographic image of the heart from outside the body; and
     an image processing apparatus,
     wherein the image processing apparatus includes:
      an image input unit that receives input of the first tomographic image and the second tomographic image; and
      a control unit that generates, based on the first tomographic image and the second tomographic image, a three-dimensional image of the heart including an abnormal site of the heart, and estimates a penetration region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  9.  An image processing method executed using an image processing apparatus, the method comprising:
     a three-dimensional image storage step of storing a three-dimensional image of a heart including an abnormal site of the heart; and
     a penetration region estimation step of estimating a penetration region into which an administration target permeates when the administration target is injected at an arbitrary injection point of the abnormal site.
  10.  An image processing apparatus comprising:
     a storage unit that stores a three-dimensional image of a heart including an abnormal site of the heart;
     a display unit capable of displaying the three-dimensional image; and
     a control unit that determines, based on the three-dimensional image, positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site, and causes the display unit to display the plurality of target injection points superimposed on the three-dimensional image.
  11.  The image processing apparatus according to claim 10, wherein the control unit determines an order of the plurality of target injection points and causes the display unit to display the plurality of target injection points in a manner based on the order.
  12.  The image processing apparatus according to claim 11, wherein the control unit estimates a movement path along which a distal end of an injection member that injects the administration target moves via the plurality of target injection points, and determines the order based on the movement path.
  13.  The image processing apparatus according to any one of claims 10 to 12, wherein the control unit estimates a penetration region into which the administration target permeates when the administration target is injected at a predetermined injection point of the abnormal site, and determines the positions of the plurality of target injection points based on the penetration region.
  14.  An image processing system comprising:
     a first imaging device that captures a first tomographic image of a heart from outside a body;
     a second imaging device that captures a second tomographic image of the heart from outside the body; and
     an image processing apparatus,
     wherein the image processing apparatus includes:
      an image input unit that receives input of the first tomographic image and the second tomographic image;
      a display unit; and
      a control unit that generates, based on the first tomographic image and the second tomographic image, a three-dimensional image of the heart including an abnormal site of the heart, determines, based on the three-dimensional image, positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site, and causes the display unit to display the plurality of target injection points superimposed on the three-dimensional image.
  15.  An image processing method executed using an image processing apparatus, the method comprising:
     a three-dimensional image storage step of storing a three-dimensional image of a heart including an abnormal site of the heart;
     a three-dimensional image display step of displaying the three-dimensional image;
     a target injection point determination step of determining, based on the three-dimensional image, positions of a plurality of target injection points at which an administration target is to be injected into the abnormal site; and
     a target injection point display step of displaying the plurality of target injection points superimposed on the three-dimensional image.
PCT/JP2018/018902 2017-05-16 2018-05-16 Image processing device, image processing system, and image processing method WO2018212231A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019518836A JPWO2018212231A1 (en) 2017-05-16 2018-05-16 Image processing apparatus, image processing system, and image processing method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017-097660 2017-05-16
JP2017097660 2017-05-16
JP2017-097661 2017-05-16
JP2017097661 2017-05-16

Publications (1)

Publication Number Publication Date
WO2018212231A1 true WO2018212231A1 (en) 2018-11-22

Family

ID=64274413

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018902 WO2018212231A1 (en) 2017-05-16 2018-05-16 Image processing device, image processing system, and image processing method

Country Status (2)

Country Link
JP (1) JPWO2018212231A1 (en)
WO (1) WO2018212231A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040009459A1 (en) * 2002-05-06 2004-01-15 Anderson James H. Simulation system for medical procedures
WO2008050316A2 (en) * 2006-10-22 2008-05-02 Paieon Inc. Method and apparatus for positioning a therapeutic device in a tubular organ dilated by an auxiliary device balloon
JP2009106530A (en) * 2007-10-30 2009-05-21 Toshiba Corp Medical image processing apparatus, medical image processing method, and medical image diagnostic apparatus
US20110087110A1 (en) * 2009-10-13 2011-04-14 Cell Genetics, Llc Medical imaging processes for facilitating catheter-based delivery of therapy to affected organ tissue
JP2012165910A (en) * 2011-02-15 2012-09-06 Fujifilm Corp Surgery-assistance apparatus, method, and program
US20150257741A1 (en) * 2014-03-12 2015-09-17 Samsung Medison Co., Ltd. Method and ultrasound apparatus for displaying diffusion boundary of medicine

Also Published As

Publication number Publication date
JPWO2018212231A1 (en) 2020-03-19

Similar Documents

Publication Publication Date Title
US8428220B2 (en) Dynamical visualization of coronary vessels and myocardial perfusion information
US10010373B2 (en) Navigation system for cardiac therapies using gating
US6746401B2 (en) Tissue ablation visualization
US6923768B2 (en) Method and apparatus for acquiring and displaying a medical instrument introduced into a cavity organ of a patient to be examined or treated
JP6906113B2 (en) Devices, systems and methods for visualizing cyclically moving biological structures
US8311613B2 (en) Electrode catheter positioning system
US20070055142A1 (en) Method and apparatus for image guided position tracking during percutaneous procedures
US20110118590A1 (en) System For Continuous Cardiac Imaging And Mapping
US9072490B2 (en) Image processing apparatus and image processing method
US11191524B2 (en) Ultrasonic diagnostic apparatus and non-transitory computer readable medium
JP2017217474A (en) Medical image diagnostic apparatus and medical image processing system
CN114098780A (en) CT scanning method, device, electronic device and storage medium
JP7062004B2 (en) Image processing device, image processing system and image processing method
US10891710B2 (en) Image processing device, method, and program
US10888302B2 (en) Image processing device, method, and program
WO2018212231A1 (en) Image processing device, image processing system, and image processing method
US20190247066A1 (en) Treatment of myocardial infarction using sonothrombolytic ultrasound
CN110215203B (en) Electrocardiosignal acquisition method and device, computer equipment and storage medium
JPWO2018212249A1 (en) Image display device and image display device set
JP7271126B2 (en) Ultrasound diagnostic equipment and medical image processing equipment
US20220023663A1 (en) Radiotherapy support system and method
WO2018212248A1 (en) Image processing device and image processing method
JP2006000422A (en) X-ray diagnostic equipment, image data processor, and image data processing method
WO2009156894A1 (en) Method and system for cardiac resynchronization therapy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18802310

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019518836

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18802310

Country of ref document: EP

Kind code of ref document: A1