WO2024071251A1 - Computer program, information processing method, information processing device, and learning model - Google Patents


Info

Publication number
WO2024071251A1
Authority
WO
WIPO (PCT)
Prior art keywords
plaque
stent
image
lesion
information
Prior art date
Application number
PCT/JP2023/035280
Other languages
English (en)
Japanese (ja)
Inventor
貴則 富永
Original Assignee
テルモ株式会社 (Terumo Corporation)
Priority date
Filing date
Publication date
Application filed by テルモ株式会社 (Terumo Corporation)
Publication of WO2024071251A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045 Control thereof
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters

Definitions

  • The present invention relates to a computer program, an information processing method, an information processing device, and a learning model for processing medical images.
  • Diagnostic medical catheters are used for diagnosing or treating lesions in tubular organs such as blood vessels. A diagnostic catheter equipped with an ultrasonic sensor or a light-receiving sensor is moved through the organ, and images generated from the sensor signals are used for diagnosis.
  • Diagnostic imaging of tubular organs, particularly blood vessels, is essential for the safe and reliable performance of procedures such as percutaneous coronary intervention (PCI).
  • Intravascular imaging techniques using medical catheters, such as IVUS (Intravascular Ultrasound) and OCT (Optical Coherence Tomography), are becoming widespread.
  • Doctors and other medical professionals refer to medical images based on these imaging technologies to understand the condition of tubular organs, make diagnoses, and provide treatment.
  • Various technologies have been proposed for generating and displaying information that assists in interpreting such medical images through image processing or calculation (Patent Document 1, etc.).
  • Medical professionals interpret the medical images to understand the anatomical characteristics of the patient's tubular organ and the condition of the affected area, and then perform treatment by expanding the blocked organ with a balloon attached to the tip of a treatment catheter, or by placing a stent inside the organ. At this time, it is desirable to output information, based on the medical images, that allows the extent of the affected area and the condition of the surrounding tissue to be accurately understood.
  • The purpose of this disclosure is to provide a computer program, an information processing method, an information processing device, and a learning model capable of displaying the appropriate information required for judgments regarding medical images.
  • The computer program of the present disclosure causes a computer to execute a process of calculating data indicating anatomical characteristics of a tubular organ based on a signal output from an imaging device provided on a catheter inserted into the tubular organ, identifying the range on the longitudinal axis of the tubular organ in which a lesion exists based on the calculated data, and outputting information for placing a stent in the tubular organ based on the identified longitudinal range of the lesion.
  • The computer is caused to execute a process of outputting information on the landing zone of the stent as information for placing the stent in the tubular organ.
  • The computer is caused to execute a process of outputting, as information for placing a stent in the tubular organ, information on the position of a reference portion, i.e., a site before and after the longitudinal range of the lesion where the lumen is larger.
  • The computer is caused to execute a process of changing the position of the reference portion based on information on other lesion areas in the vicinity of the longitudinal range of the lesion, and outputting information on the position of the reference portion after the change.
  • The computer is caused to execute a process of outputting a proposal for the size of the stent to be placed, based on the data indicating the anatomical features and the longitudinal position of the reference portion.
  • The imaging device is a catheter device that includes a transmitter and a receiver for waves of different wavelengths.
  • The imaging device is a dual-type catheter device including a transmitter and a receiver for each of IVUS and OCT.
  • The lesions are of different types, including lipid plaque, fibrous plaque, and calcified plaque.
  • The computer is caused to execute a process of identifying the longitudinal extent of each of the different types of lesions in the tubular organ, based on the signals from the catheter device corresponding to those types.
  • The computer is caused to execute a process of calculating the longitudinal distribution of plaque burden from a tomographic image of the tubular organ based on the signal obtained from the IVUS sensor, identifying the longitudinal position of lipid plaque or fibrous plaque based on the signal obtained from the OCT sensor, and outputting information for placing a stent based on the plaque burden distribution and the plaque positions.
  • The computer is caused to execute a process of calculating the longitudinal distribution of plaque burden from a tomographic image of the tubular organ based on the signal obtained from the IVUS sensor, identifying the longitudinal position of lipid plaque based on the same IVUS signal, identifying the longitudinal position of lipid plaque or fibrous plaque based on the signal obtained from the OCT sensor, and outputting information for placing a stent based on the plaque burden distribution and the plaque positions.
  • A computer acquires a signal output from an imaging device provided on a catheter inserted into a tubular organ, calculates data indicating anatomical characteristics of the organ based on that signal, identifies the range on the longitudinal axis of the organ in which a lesion exists based on the calculated data, and outputs information for placing a stent in the tubular organ based on the identified longitudinal range of the lesion.
  • An information processing device according to the present disclosure acquires a signal output from an imaging device provided on a catheter inserted into a tubular organ, and includes a storage unit that stores a trained model that, when a tomographic image of the tubular organ based on the signal is input, outputs data for distinguishing the ranges of tissue or lesions shown in that image, and a processing unit that executes image processing based on the signal from the imaging device. The processing unit inputs an image based on the signal to the model, calculates data indicating anatomical characteristics of the tubular organ based on the data output from the model, identifies the range on the long axis of the organ where a lesion exists based on the calculated data, and outputs information for placing a stent in the tubular organ based on the identified longitudinal range of the lesion.
  • The learning model according to the present disclosure includes an input layer to which data indicating anatomical characteristics of a tubular organ, namely the distribution of that data in the longitudinal direction of the organ, is input; an output layer that outputs the suitability of a stent to be placed in a lesion of the tubular organ; and an intermediate layer trained on teacher data that includes such distributions and the track record of stents used for lesions with those distributions. The model causes a computer to function such that the longitudinal distribution of the anatomical data is provided to the input layer, calculation is performed in the intermediate layer, and the suitability of a stent corresponding to the distribution is output from the output layer.
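As a concrete reading of this claim, the model can be sketched as a small feed-forward network whose input is the longitudinal distribution of an anatomical metric (e.g., plaque burden sampled at a fixed number of positions) and whose output is a suitability score per candidate stent. The layer widths, the number of candidate stents, and the softmax output below are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class StentSuitabilityNet:
    """Toy stand-in for the claimed learning model: the input layer takes
    the longitudinal distribution of an anatomical metric, the intermediate
    layer is assumed to be dense, and the output layer scores candidate
    stents. Weights are random here; training is out of scope."""
    def __init__(self, n_positions=64, n_hidden=32, n_stents=5, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_hidden, n_positions))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.1, (n_stents, n_hidden))
        self.b2 = np.zeros(n_stents)

    def forward(self, distribution):
        h = relu(self.w1 @ distribution + self.b1)
        return softmax(self.w2 @ h + self.b2)

# Example input: a plaque-burden distribution over 64 longitudinal positions
burden = np.clip(np.sin(np.linspace(0, np.pi, 64)) * 0.7, 0.0, 1.0)
scores = StentSuitabilityNet().forward(burden)
```

In an actual system the intermediate layer would be trained, as the claim states, on pairs of distributions and stent usage records.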
  • According to the present disclosure, it is possible to output information on an appropriate position for placing a stent in a tubular organ based on data showing the anatomical characteristics of the organ. This is expected to enable appropriate selection of the stent size and to improve the accuracy of diagnosis and treatment.
  • FIG. 1 is a schematic diagram of an imaging diagnostic device.
  • FIG. 2 is an explanatory diagram showing the operation of the catheter.
  • FIG. 3 is a block diagram showing the configuration of an image processing device.
  • FIG. 4 is a schematic diagram of a segmentation model.
  • FIG. 5 is a flowchart illustrating an example of an information processing procedure performed by the image processing device.
  • FIG. 6 is a flowchart illustrating an example of an information processing procedure performed by the image processing device.
  • FIG. 7 shows an example of a screen displayed on the display device.
  • FIGS. 8A to 8C are diagrams showing a process of changing the position of a reference portion.
  • FIG. 9 shows another example of a screen displayed on the display device.
  • FIG. 10 is a block diagram showing the configuration of an image processing device according to a second embodiment.
  • FIG. 11 is a schematic diagram of a plaque detection model.
  • FIG. 12 is a flowchart illustrating an example of an information processing procedure by the image processing device according to the second embodiment.
  • FIG. 13 is a flowchart illustrating an example of an information processing procedure by the image processing device according to the second embodiment.
  • FIG. 14 shows an example of a screen displayed on the display device.
  • FIG. 15 is a block diagram showing the configuration of an image processing device according to a third embodiment.
  • FIG. 16 is a schematic diagram of a stent information model.
  • FIG. 17 is a flowchart illustrating an example of a process for generating the stent information model.
  • FIG. 18 is a flowchart illustrating an example of an information processing procedure by the image processing device according to the third embodiment.
  • FIG. 19 is a flowchart illustrating an example of an information processing procedure by the image processing device according to the third embodiment.
  • FIG. 20 shows an example of a screen displayed on the display device.
  • (First Embodiment) FIG. 1 is a schematic diagram of an image diagnostic apparatus 100.
  • the image diagnostic apparatus 100 includes a catheter 1, an MDU (Motor Drive Unit) 2, an image processing device (information processing device) 3, a display device 4, and an input device 5.
  • The catheter 1 is a flexible tube for medical use.
  • The catheter 1 is what is known as an imaging catheter: it has an imaging device 11 at its tip and is rotated in the circumferential direction by a drive at its base end.
  • The imaging device 11 of the catheter 1 is a dual-type catheter device that includes a transmitter and a receiver for waves of different wavelengths (ultrasound and light).
  • The imaging device 11 includes an ultrasound probe with an ultrasound transducer and an ultrasound sensor for the IVUS method, and an OCT device with a near-infrared laser and a near-infrared sensor.
  • The OCT device includes an optical element with lens and reflecting functions at its tip, and may have a structure that guides light to the near-infrared laser and near-infrared sensor connected via an optical fiber.
  • The dual-type combination is not limited to IVUS and OCT; near-infrared spectroscopy or other modalities may also be used.
  • The MDU 2 is a drive unit attached to the base end of the catheter 1, and controls the operation of the catheter 1 by driving its internal motor in response to operations by the examination operator.
  • The image processing device 3 generates multiple medical images, such as cross-sectional images of blood vessels, based on the signal output from the imaging device 11 of the catheter 1.
  • The configuration of the image processing device 3 will be described in detail later.
  • The display device 4 uses a liquid crystal display panel, an organic EL (Electro Luminescence) display panel, or the like.
  • The display device 4 displays the medical images generated by the image processing device 3 and information related to them.
  • The input device 5 is an input interface that accepts operations for the image processing device 3.
  • The input device 5 may be a keyboard, a mouse, etc., or may be a touch panel, soft keys, or hard keys built into the display device 4.
  • The input device 5 may also accept operations by voice input; in this case, it uses a microphone and a voice recognition engine.
  • FIG. 2 is an explanatory diagram showing the operation of the catheter 1.
  • The catheter 1 is inserted by the examination operator into the tubular blood vessel L, along a guide wire W inserted into the coronary artery shown in the figure.
  • In the figure, the right side corresponds to the distal side from the insertion point of the catheter 1 and guide wire W, and the left side corresponds to the proximal side.
  • Driven by the MDU 2, the catheter 1 moves from the distal end to the proximal end within the blood vessel L, as shown by the arrow in the figure, and while rotating in the circumferential direction, the imaging device 11 scans the blood vessel in a spiral manner.
  • The image processing device 3 acquires the signals for each scan output from the imaging device 11 of the catheter 1, for both IVUS and OCT.
  • One scan consists of emitting a detection wave from the imaging device 11 in the radial direction and detecting the reflected wave.
  • For each of IVUS and OCT, the image processing device 3 arranges the signals of each scan line by line in the radial direction over 360 degrees to form a rectangular image (I0 in FIG. 2), and generates a tomographic image (cross-sectional image, I1 in FIG. 2) by polar coordinate conversion (inverse transformation) for every 360 degrees.
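The polar-to-Cartesian step described above can be sketched as follows. The output size and the nearest-neighbour sampling are illustrative assumptions; a production implementation would typically interpolate:

```python
import numpy as np

def rect_to_tomographic(rect, out_size=256):
    """Convert a rectangular scan image (rows = angular lines over 360
    degrees, columns = radial samples) into a cross-sectional image by
    inverse polar coordinate conversion (nearest-neighbour sampling)."""
    n_angles, n_radii = rect.shape
    c = out_size / 2.0
    y, x = np.mgrid[0:out_size, 0:out_size]
    dx, dy = x - c, y - c
    # Radius and angle of each output pixel, mapped to rect indices
    r = np.sqrt(dx**2 + dy**2) / c * (n_radii - 1)
    theta = (np.arctan2(dy, dx) % (2 * np.pi)) / (2 * np.pi) * n_angles
    out = np.zeros((out_size, out_size), dtype=rect.dtype)
    valid = r <= n_radii - 1  # pixels inside the scanned radius
    out[valid] = rect[theta.astype(int)[valid] % n_angles,
                      r.astype(int)[valid]]
    return out

# One 360-degree scan: 360 angular lines of 200 radial samples each
rect = np.tile(np.linspace(0.0, 1.0, 200), (360, 1))
tomo = rect_to_tomographic(rect)
```

Pixels outside the scanned radius are left at zero, matching the circular field of view of a rotational catheter image.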
  • Hereinafter, the rectangular image generated from the IVUS signal is referred to as rectangular image I01 and the corresponding tomographic image as tomographic image I11, to distinguish them from the rectangular image I02 and tomographic image I12 generated from the OCT signal.
  • The tomographic images I11 and I12 are also called frame images.
  • The reference point (center) of the cross-sectional images I11 and I12 corresponds to the range occupied by the catheter 1 (not imaged).
  • The image processing device 3 may further generate a long axis image (longitudinal cross-sectional image) in which the pixel values on a line passing through the reference point of each of the cross-sectional images I11 and I12 are arranged along the length (longitudinal direction) of the blood vessel scanned by the catheter 1.
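The long axis image construction can be read as: take, from each cross-sectional frame, the pixel values on a line through the reference point, and stack those lines along the pullback direction. A minimal sketch (the frame stack and the horizontal line orientation are assumptions):

```python
import numpy as np

def long_axis_image(frames, row=None):
    """Build a longitudinal cross-section from a pullback stack of
    cross-sectional frames by sampling the horizontal line through each
    frame's reference point (center) and arranging those lines along the
    vessel's long axis."""
    frames = np.asarray(frames)           # (n_frames, H, W)
    n, h, w = frames.shape
    row = h // 2 if row is None else row  # line through the center
    return frames[:, row, :].T            # (W, n_frames): depth vs. position

# 100 frames of 64x64 pixels standing in for a pullback recording
stack = np.random.default_rng(1).random((100, 64, 64))
laxis = long_axis_image(stack)
```

In practice the sampled line can be rotated to any angle through the reference point; the fixed horizontal line here keeps the sketch short.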
  • The image processing device 3 calculates data indicating the anatomical characteristics of the blood vessel based on the obtained rectangular images I01 and I02, cross-sectional images I11 and I12, or long axis image, and outputs the cross-sectional images or long axis image together with the calculated data so that they can be viewed by doctors, examination operators, and other medical personnel.
  • The image processing device 3 performs image processing on the rectangular images I01 and I02, the tomographic images I11 and I12, or the long axis image to output images that make it easier to grasp the anatomical features of the blood vessel and the state of the lesion. Specifically, the image processing device 3 outputs information for placing a stent in an area that includes the lesion. This output processing is described in detail below.
  • FIG. 3 is a block diagram showing the configuration of the image processing device 3.
  • The image processing device 3 is a computer, and includes a processing unit 30, a storage unit 31, and an input/output I/F 32.
  • The processing unit 30 includes one or more CPUs (Central Processing Units), MPUs (Micro-Processing Units), GPUs (Graphics Processing Units), GPGPUs (General-Purpose computing on Graphics Processing Units), TPUs (Tensor Processing Units), or the like.
  • The processing unit 30 has built-in memory such as RAM (Random Access Memory), and performs calculations based on a computer program P3 stored in the storage unit 31 while storing data generated during processing in that memory.
  • The storage unit 31 is a non-volatile storage medium such as a hard disk or flash memory.
  • The storage unit 31 stores the computer program P3 read by the processing unit 30, setting data, and the like.
  • The storage unit 31 also stores a trained segmentation model 31M.
  • The segmentation model 31M includes a first model 311M trained on IVUS tomographic images I11 and a second model 312M trained on OCT tomographic images I12.
  • The computer program P3 and the segmentation model 31M may be copies of a computer program P9 and a segmentation model 91M stored in a non-transitory storage medium 9 outside the device, read in via the input/output I/F 32.
  • The computer program P3 and the segmentation model 31M may also be distributed by a remote server device, acquired by the image processing device 3 via a communication unit (not shown), and stored in the storage unit 31.
  • The input/output I/F 32 is an interface to which the catheter 1, the display device 4, and the input device 5 are connected.
  • The processing unit 30 acquires the signal (digital data) output from the imaging device 11 via the input/output I/F 32.
  • The processing unit 30 outputs screen data of a screen including the generated tomographic images and/or long axis image to the display device 4 via the input/output I/F 32.
  • The processing unit 30 accepts operation information input via the input device 5 through the input/output I/F 32.
  • FIG. 4 is a schematic diagram of segmentation model 31M. Of first model 311M and second model 312M that constitute segmentation model 31M, FIG. 4 shows first model 311M. The configuration of second model 312M is the same as that of first model 311M except that the image to be learned is different, so illustration and detailed description are omitted.
  • The first model 311M and the second model 312M that make up the segmentation model 31M are each models trained to output, when an image is input, an image showing the areas of one or more objects appearing in that image.
  • The first model 311M is, for example, a model that performs semantic segmentation.
  • The first model 311M outputs an image in which each pixel of the input image is tagged with data indicating which object region the pixel belongs to.
  • The first model 311M uses, for example, a so-called U-net, in which convolution layers, pooling layers, upsampling layers, and a softmax layer are symmetrically arranged, as shown in FIG. 4.
  • The first model 311M outputs a tag image IS1 in which the pixels in the lumen range of the blood vessel, the membrane range (the area between the lumen boundary and the blood vessel boundary, including the tunica media), the range in which the guide wire W and its reflection appear, and the range corresponding to the catheter 1 are tagged with different pixel values (shown by different types of hatching and solid color in FIG. 4).
  • The first model 311M further identifies the range of lipid plaque formed in the blood vessel.
  • The first model 311M for IVUS identifies the areas in which fibrous plaque or calcified plaque appears; IVUS can distinguish between fibrous plaque and calcified plaque.
  • The first model 311M is exemplified here by semantic segmentation and U-net, but it is of course not limited to these.
  • The first model 311M may be a model that realizes individual recognition processing using instance segmentation or the like.
  • The first model 311M is not limited to U-net, and may be based on SegNet or R-CNN, or be an integrated model combined with other edge extraction processing, etc.
  • The processing unit 30 identifies the blood (lumen area), intima area, media area, and adventitia area of the blood vessel shown in the tomographic image I11, based on the pixel values in the tag image IS1 obtained by inputting the IVUS tomographic image I11 into the first model 311M and their coordinates within the image.
  • The processing unit 30 can thereby detect the lumen boundary and the blood vessel boundary of the blood vessel shown in the tomographic image I11. Strictly speaking, the blood vessel boundary is the external elastic membrane (EEM) between the tunica media and the adventitia.
  • The processing unit 30 identifies the ranges of lipid plaque, fibrous plaque, and calcified plaque based on the pixel values in the tag image IS1 obtained by inputting the tomographic image I11 into the first model 311M for IVUS and their coordinates within the image.
  • The processing unit 30 similarly inputs the OCT cross-sectional image I12 into the second model 312M and, based on the pixel values in the resulting tag image IS2 and their coordinates within the image, identifies the blood (lumen area), calcified plaque, fibrous plaque, and lipid plaque of the blood vessel shown in the cross-sectional image I12.
  • The imaging diagnostic device 100 of the present disclosure is used in diagnosis for placing a stent in a lesion in a blood vessel, which is a tubular organ.
  • The examination operator or medical provider performs a scan with the imaging device 11 to evaluate and diagnose the condition of the lesion.
  • The imaging diagnostic device 100 outputs, to the display device 4, information for determining the type and size of the balloon to be used, together with the scan results of the imaging device 11.
  • The examination operator or medical provider performs a scan with the imaging device 11 to evaluate the condition of the blood vessel opened by the balloon, and determines the type of stent to be placed and where it should be placed.
  • The examination operator or medical provider places the stent of the determined type and size using the catheter 1.
  • The image processing device 3 generates and outputs information indicating the type of stent and the position at which it should be placed, based on information obtained by image processing of the IVUS tomographic image I11 and the OCT tomographic image I12. The process of outputting information about the stent is described below.
  • The processing unit 30 of the image processing device 3 identifies the lumen boundary of the vascular lumen range from the ranges identified for each of the IVUS tomographic image I11 and the OCT tomographic image I12, and calculates numerical values such as the maximum diameter, minimum diameter, and average inner diameter inside the lumen boundary. Furthermore, from the identified ranges of calcified plaque, fibrous plaque, and lipid plaque, the processing unit 30 calculates the ratio of the plaque cross-sectional area to the area inside the vascular boundary (hereinafter referred to as plaque burden).
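Assuming the tag image encodes one integer label per pixel, the per-frame metrics described above can be sketched as follows. The label codes, the pixel pitch, and the equivalent-area-circle diameter are illustrative assumptions:

```python
import numpy as np

# Assumed label codes for the tag image (illustrative, not from the patent)
LUMEN, MEDIA, LIPID, FIBROUS, CALCIFIED = 1, 2, 3, 4, 5

def frame_metrics(tag, px_mm=0.02):
    """Compute mean lumen diameter and plaque burden from one tag image.
    Plaque burden = plaque cross-sectional area / area inside the vessel
    boundary (lumen + membrane + plaque), expressed as a percentage."""
    lumen_area = np.count_nonzero(tag == LUMEN)
    plaque_area = np.count_nonzero(np.isin(tag, (LIPID, FIBROUS, CALCIFIED)))
    vessel_area = lumen_area + np.count_nonzero(tag == MEDIA) + plaque_area
    # Mean lumen diameter via the equivalent-area circle (a simplification;
    # max/min diameters would need boundary-point geometry)
    mean_diam_mm = 2.0 * np.sqrt(lumen_area / np.pi) * px_mm
    burden_pct = 100.0 * plaque_area / vessel_area if vessel_area else 0.0
    return mean_diam_mm, burden_pct

# Synthetic tag image: lumen disk, membrane ring, plaque ring
tag = np.zeros((200, 200), dtype=int)
yy, xx = np.mgrid[0:200, 0:200]
r = np.sqrt((yy - 100) ** 2 + (xx - 100) ** 2)
tag[r < 80] = LIPID   # outermost: plaque ring ...
tag[r < 60] = MEDIA   # ... then membrane ring ...
tag[r < 40] = LUMEN   # ... innermost: lumen disk
diam, burden = frame_metrics(tag)
```

Repeating this per frame and storing the results against the long-axis position yields the longitudinal distributions used in the following steps.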
  • The image processing device 3 of the present disclosure outputs graphs of the distribution of the average lumen diameter and the distribution of plaque burden with respect to position in the longitudinal direction of the blood vessel.
  • The image processing device 3 further outputs, on the graph showing the distribution, a reference portion to be referred to for placing a stent and candidates for the landing zone of the stent.
  • FIGS. 5 and 6 are flowcharts showing an example of the information processing procedure by the image processing device 3.
  • The processing unit 30 of the image processing device 3 starts the following processing when a signal is output from the imaging device 11 of the catheter 1.
  • The processing unit 30 generates tomographic images I11 and I12 (step S102) each time it acquires a predetermined amount (e.g., 360 degrees) of signals (data) from the imaging device 11 of the catheter 1 for both IVUS and OCT (step S101).
  • The processing unit 30 performs polar coordinate conversion (inverse transformation) on the rectangularly arranged signals for each of IVUS and OCT to generate the tomographic images I11 and I12.
  • The processing unit 30 stores the signal data acquired in step S101 and the tomographic images I11 and I12 generated in step S102, for each of IVUS and OCT, in the storage unit 31 in association with positions on the long axis of the blood vessel (step S103).
  • The processing unit 30 inputs the IVUS tomographic image I11 to the first model 311M for IVUS (step S104).
  • The processing unit 30 stores the region identification result (tag image IS1) output from the first model 311M in the storage unit 31 in association with the position on the long axis of the blood vessel (step S105).
  • The processing unit 30 inputs the OCT tomographic image I12 to the second model 312M for OCT (step S106).
  • The processing unit 30 stores the region identification result (tag image IS2) output from the second model 312M in the storage unit 31 in association with the position on the long axis of the blood vessel (step S107).
  • The processing unit 30 extracts the necessary area images from the tomographic images I11 and I12 based on the area identification results for the IVUS tomographic image I11 (tag image IS1) and the OCT tomographic image I12 (tag image IS2) (step S108).
  • The processing unit 30 extracts, for example, area images of the membrane area corresponding to the media and adventitia, and area images of the lipid plaque area, from the IVUS tomographic image I11, and extracts area images of the lumen area and of the fibrous and calcified plaque areas from the OCT tomographic image I12. That is, the processing unit 30 extracts each anatomical feature and lesion area from whichever of the IVUS and OCT images allows clearer area identification.
  • The processing unit 30 synthesizes the extracted area images to create a corrected tomographic image (step S109).
  • The processing unit 30 calculates the coordinate (angle) shift between the IVUS tomographic image I11 and the OCT tomographic image I12 so that they can be properly overlaid on each other.
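The disclosure only states that the angle shift between the IVUS and OCT frames is calculated; one concrete way to estimate it, assumed here for illustration, is circular cross-correlation of angular intensity profiles:

```python
import numpy as np

def angular_shift(profile_a, profile_b):
    """Estimate the circular shift (in angular samples) that best aligns
    two angular profiles, using FFT-based circular cross-correlation."""
    fa, fb = np.fft.fft(profile_a), np.fft.fft(profile_b)
    corr = np.fft.ifft(fa * np.conj(fb)).real
    return int(np.argmax(corr))

# Synthetic angular profiles, one per modality, 360 samples per revolution
a = np.sin(np.linspace(0, 2 * np.pi, 360, endpoint=False))
b = np.roll(a, 37)              # "OCT" profile rotated by 37 samples
shift = angular_shift(b, a)     # recover the rotation
```

With the shift known, one of the two tag images can be rotated before the area images are synthesized into the corrected tomographic image.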
  • The processing unit 30 calculates data indicating anatomical characteristics, including the maximum, minimum, and average inner diameter of the range inside the lumen boundary of the blood vessel and the plaque burden, for the corrected tomographic image (step S110).
  • The processing unit 30 stores the data indicating the anatomical characteristics calculated in step S110 (the average inner diameter inside the lumen boundary, the plaque burden, etc.) in the storage unit 31 in association with the position on the long axis of the blood vessel (step S111). In step S111, the processing unit 30 may also store the angle ranges of lipid, fibrous, or calcified plaque identified in the tomographic images I11 and I12.
  • The processing unit 30 determines whether scanning by the imaging device 11 of the catheter 1 has been completed (step S112). If it determines that scanning has not been completed (S112: NO), the processing unit 30 returns to step S101 and generates the next tomographic images I11 and I12.
  • The processing unit 30 creates and outputs a graph showing the distribution of the data indicating anatomical features over the entire longitudinal direction of the scanned blood vessel (step S113).
  • The processing unit 30 identifies the position and range of the lesion (plaque) on the long axis based on the various data stored in the storage unit 31 in association with positions on the long axis of the blood vessel (step S114).
  • The processing unit 30 identifies as the range of plaque, for example, the positions on the long axis where the plaque burden is equal to or greater than a set percentage threshold (e.g., 50%) continuously over at least a set length threshold (e.g., 4 mm).
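The thresholding step just described (burden at or above 50% for at least 4 mm) can be sketched as a run-length search over the stored longitudinal distribution; the pullback frame spacing is an assumption:

```python
import numpy as np

def lesion_ranges(burden, pct_threshold=50.0, len_threshold_mm=4.0,
                  frame_pitch_mm=0.5):
    """Identify longitudinal ranges where plaque burden stays at or above
    pct_threshold for at least len_threshold_mm. Returns (start, end)
    frame-index pairs, end exclusive. frame_pitch_mm is an assumed
    pullback frame spacing."""
    min_frames = int(np.ceil(len_threshold_mm / frame_pitch_mm))
    above = np.asarray(burden) >= pct_threshold
    padded = np.r_[0, above.astype(np.int8), 0]  # pad so runs have edges
    edges = np.flatnonzero(np.diff(padded))      # rise/fall positions
    runs = edges.reshape(-1, 2)                  # (start, end) per run
    return [(int(a), int(b)) for a, b in runs if b - a >= min_frames]

burden = np.zeros(60)
burden[10:25] = 65.0   # 15 frames = 7.5 mm above threshold: a lesion
burden[40:44] = 55.0   # 4 frames = 2.0 mm: too short to count
ranges = lesion_ranges(burden)
```

Runs shorter than the length threshold are discarded, so brief threshold crossings do not register as lesions.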
  • The processing unit 30 determines a reference area for the identified lesion based on its position and range on the long axis (step S115). In step S115, the processing unit 30 determines as the reference area the area with the largest lumen diameter before and after the range of the lesion, within a predetermined range (10 mm) from the lesion or up to the location where a large side branch is present. In step S115, if the reference area overlaps with a location where lipid plaque is present, the processing unit 30 re-determines as the reference area the location with the lowest plaque burden outside the range of the lipid plaque.
  • the processing unit 30 stores the position of the determined reference area (step S116).
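A minimal sketch of the reference-area selection in step S115, under assumed data structures (per-frame arrays indexed along the long axis, 0.5 mm pitch). Truncation at a large side branch is omitted for brevity; the fallback when the best frame holds lipid plaque follows the lowest-plaque-burden rule described above.

```python
def choose_reference(lumen_diam, lipid, burden, lesion,
                     pitch_mm=0.5, window_mm=10.0):
    """lesion: (first, last) frame indices of the lesion range. For each side,
    pick the frame with the largest lumen diameter within window_mm of the
    lesion; if lipid plaque sits there, fall back to the non-lipid frame with
    the lowest plaque burden in the same window."""
    w = int(window_mm / pitch_mm)
    first, last = lesion

    def pick(idxs):
        if not idxs:
            return None
        best = max(idxs, key=lambda i: lumen_diam[i])
        if lipid[best]:                                  # avoid lipid plaque
            clean = [i for i in idxs if not lipid[i]]
            if clean:
                best = min(clean, key=lambda i: burden[i])
        return best

    proximal = pick(list(range(max(0, first - w), first)))
    distal = pick(list(range(last + 1, min(len(lumen_diam), last + 1 + w))))
    return proximal, distal
```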
  • the processing unit 30 outputs to the display device 4 the position and range on the long axis of the lesion identified in step S114 and a graphic showing the reference area determined in step S115 on the graph displayed in step S113 (step S117).
  • the processing unit 30 ends the process.
  • Figure 7 shows an example of a screen 400 displayed on the display device 4.
  • the screen 400 shown in Figure 7 is displayed after scanning from the proximal to the distal end of the blood vessel is completed in order to output the type, size, and placement position of the stent.
  • Screen 400 includes cursor 401 indicating the position on the long axis of the blood vessel corresponding to tomographic image I11 or tomographic image I12 to be displayed, tomographic image I11 and tomographic image I12 generated based on the signal obtained at that position, and corrected tomographic image I3.
  • Screen 400 includes data column 402 displaying numerical values of data indicating anatomical features calculated by image processing of tomographic images I11, I12, and I3. Corrected tomographic image I3, IVUS tomographic image I11, and OCT tomographic image I12 may be displayed in a switched manner each time they are selected on screen 400.
  • Screen 400 further includes graphs 403 and 404 showing the distribution of data indicating anatomical characteristics with respect to position on the long axis of the blood vessel.
  • Graph 403 shows the distribution of mean lumen diameter with respect to position on the long axis.
  • Graph 404 shows the distribution of plaque burden with respect to position on the long axis.
  • Graph 404 is displayed with graphic 405 superimposed on it.
  • Graphic 405 indicates the range in which the portion in which the plaque burden is equal to or greater than the percentage threshold (50% in this case) continues for 2 mm or more in the longitudinal direction. Examination operators and other medical personnel who visually view graph 404 with graphic 405 superimposed thereon can understand that in the blood vessels in the range in which graphic 405 is displayed, the plaque burden is equal to or greater than the threshold over a length of 2 mm or more.
  • Graph 404 displays, on its long axis, a graphic 406 indicating the presence of lipid plaque, and a graphic 407 indicating the presence of fibrous plaque or calcified plaque.
  • Graph 404 also displays a bar 408 indicating the position of the reference area relative to the lesion.
  • the examination operator or medical provider visually viewing the screen 400 shown in FIG. 7 can recognize the range where the plaque burden is less than the threshold and the average lumen diameter is large, and the range where the plaque burden is equal to or greater than the threshold and the average lumen diameter is small, by juxtaposing the graphs 403 and 404.
  • the examination operator or medical provider can regard the range where the plaque burden is equal to or greater than the threshold and the average lumen diameter is small as the lesion, and use this as information for determining what kind of balloon or stent should be used to expand the lesion from the inside. After expanding the lesion with the balloon, the examination operator or medical provider also checks the screen 400 again based on the signal obtained by scanning the catheter 1.
  • the examination operator or medical provider can determine where the stent should be placed by referring to the graphic 405 indicating the range of the lesion, the graphic 406 indicating the range of the lipid plaque, the graphic 407 indicating the range of the fibrous plaque, and the bar 408 indicating the reference area.
  • FIG. 8 is a diagram showing the process of changing the position of the reference area.
  • graphs 404 indicating the distribution of plaque burden with respect to the position on the long axis are shown above and below.
  • the upper graph 404 shows the state before the change, and the lower graph 404 shows the state after the change.
  • a reference portion is determined as the portion with the largest lumen diameter among positions on the long axis within 10 mm proximal and within 10 mm distal to the lesion (the range of the hatched graphic 405), up to the location where a large side branch exists.
  • the image processing of the tomographic images I11 and I12 by the processing unit 30 identifies the presence of soft lipid plaque at the position once determined as the reference portion. If lipid plaque is present at the position of the reference portion once determined, the processing unit 30 re-determines a position within the same 10 mm range where the lumen diameter is the largest in the range where lipid plaque is not present.
  • if calcified plaque, rather than lipid plaque, is present at the once-determined position, the processing unit 30 does not need to avoid this position and re-determine the reference portion. This is because a location where calcified plaque exists may be more suitable for placing a stent than a location where lipid plaque exists.
  • in FIG. 7, a bar 408 indicating a reference portion is displayed.
  • the screen 400 may display a graphic indicating the landing zone of the stent as information for placing the stent in the tubular organ.
  • FIG. 9 shows another example of the screen 400 displayed on the display device 4.
  • in FIG. 9, not only are bars 408 displayed indicating the reference areas proximal and distal to the lesion, but a graphic 409 is also displayed indicating the landing zone, i.e., the area of contact on the long axis of the blood vessel used to secure the stent. Additionally, the screen 400 in FIG. 9 displays the dimension from the proximal end to the distal end of the landing zone. This allows the examination operator or medical provider to visually check the graphic 409 and its associated dimension to determine what size stent should be placed, and how.
  • the processing unit 30 of the image processing device 3 may output stent recommendation information on the screen 400 by referring to the pre-stored sizes for each stent part number based on the dimension from the proximal end to the distal end of the landing zone shown on the screen 400 of FIG. 9.
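The part-number lookup described above can be sketched as follows. The publication only states that sizes are pre-stored per stent part number; the catalogue contents, part numbers, and the margin rule are hypothetical example values.

```python
# Hypothetical catalogue: part number -> stent length in mm.
CATALOG = {"ST-18": 18.0, "ST-23": 23.0, "ST-28": 28.0}

def recommend_stents(landing_zone_mm, margin_mm=2.0):
    """Return part numbers whose stent length covers the landing zone
    without exceeding it by more than margin_mm."""
    return sorted(p for p, length in CATALOG.items()
                  if landing_zone_mm <= length <= landing_zone_mm + margin_mm)

print(recommend_stents(21.5))  # ['ST-23']
```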
  • a plaque detection model 32M is used that is trained to output, when an IVUS tomographic image I11 is input, whether or not lipid plaque is present, and, if present, its location.
  • the configuration of the imaging diagnostic device 100 of the second embodiment is similar to that of the imaging diagnostic device 100 of the first embodiment, except for the plaque detection model 32M stored in the image processing device 3 and the details of the processing by the processing unit 30, which are described below. Therefore, among the configurations of the imaging diagnostic device 100 of the second embodiment, the configurations common to the imaging diagnostic device 100 of the first embodiment are given the same reference numerals and detailed descriptions are omitted.
  • FIG. 10 is a block diagram showing the configuration of an image processing device 3 of the second embodiment.
  • a plaque detection model 32M is stored in the storage unit 31 of the image processing device 3.
  • the plaque detection model 32M may be a copy of the plaque detection model 92M stored in a non-transitory storage medium 9 outside the device, read out via the input/output I/F 32.
  • the plaque detection model 32M may be a model distributed by a remote server device, acquired by the image processing device 3 via a communication unit (not shown), and stored in the storage unit 31.
  • the plaque detection model 32M is a model that outputs the probability that lipid plaque is present in the input image when an IVUS tomographic image I11 or a rectangular image I01 is input.
  • the probability may be output as a single value for the entire input image (a value close to "1" if even one lipid-plaque area is present anywhere in the image), or it may be divided into radial directions corresponding to the scanning signal and output for each angle from a reference in the radial direction.
  • in the latter case, a probability is output for each angle in the tomographic image I11, such as the 12 o'clock direction (zero degrees from a reference line extending upward from the image center or blood vessel center) or the 2 o'clock direction (the angle from the same reference line).
  • the plaque detection model 32M may output the result of identifying the area in the input image in which lipid plaque is present, similar to the segmentation model 31M.
  • the plaque detection model 32M is a model using a neural network including an input layer 321, an intermediate layer 322, and an output layer 323.
  • the input layer 321 inputs a two-dimensional signal distribution, i.e., image data.
  • the output layer 323 outputs the probability that lipid plaque is present.
  • the output layer 323 may output the probability that lipid plaque is present in the form of an array of 180 values, for example, 0°, 2°, 4°, ..., 356°, 358°, in increments of 2°.
  • the processing unit 30 can input the IVUS tomographic image I11, in which lipid plaque is easy to detect, or the rectangular image I01 to the input layer 321, and obtain the output probability.
  • the processing unit 30 can obtain an array of the probability that lipid plaque exists at each angle output from the plaque detection model 32M, and obtain the continuous portion where the probability is equal to or greater than a predetermined value as the angle range of lipid plaque.
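The conversion from the per-angle probability array to angle ranges of lipid plaque can be sketched as follows. The 2° step and the probability threshold are example values from the surrounding text; wrap-around across 358° to 0° is ignored in this simplified sketch.

```python
def lipid_angle_ranges(probs, step_deg=2, thresh=0.5):
    """probs: probabilities at 0 deg, 2 deg, ..., 358 deg. Returns half-open
    (start_deg, end_deg) ranges where the probability is >= thresh
    contiguously. Wrap-around at 358 deg -> 0 deg is not handled here."""
    ranges, start = [], None
    for k, p in enumerate(probs):
        if p >= thresh and start is None:
            start = k
        elif p < thresh and start is not None:
            ranges.append((start * step_deg, k * step_deg))
            start = None
    if start is not None:                      # range running to 358 deg
        ranges.append((start * step_deg, len(probs) * step_deg))
    return ranges

probs = [0.1] * 10 + [0.9] * 20 + [0.1] * 150   # 180 values, 2-deg steps
print(lipid_angle_ranges(probs))                # [(20, 60)]
```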
  • the plaque detection model 32M is created in advance by the image processing device 3 or another processing device and is considered to have been trained.
  • the teacher data is an IVUS tomographic image I11 or a rectangular image I01 with annotations.
  • the annotation is data indicating the presence or absence of lipid plaque (for example, the probability of presence is "1" and the probability of absence is "0").
  • the plaque detection model 32M is trained using, as teacher data, IVUS tomographic images I11 or rectangular images I01 in which the presence or absence of lipid plaque has been determined. If the plaque detection model 32M is of the type that outputs the probability that lipid plaque exists for each angle, the annotation is an array of data indicating the presence or absence of lipid plaque for each angle.
  • the annotation indicating the presence or absence of lipid plaque for each angle is created based on an IVUS tomographic image I11 or rectangular image I01 in which the location of the lipid plaque in the image is known. Teacher data is created, for example, by attaching to the IVUS tomographic image I11 or rectangular image I01, in angle order, the created annotations indicating the presence or absence of lipid plaque at each angle.
  • FIGS. 12 and 13 are flowcharts showing an example of an information processing procedure by the image processing device 3 of the second embodiment. Among the processing procedures shown in Figs. 12 and 13, the same step numbers are used for the steps common to the processing procedures shown in the flowcharts of Figs. 5 and 6 of the first embodiment, and detailed descriptions thereof will be omitted.
  • the processing unit 30 stores data indicating anatomical features calculated based on the range identification performed on each of the IVUS tomographic image I11 and the OCT tomographic image I21 (S111), and then performs the following processing.
  • the processing unit 30 inputs the IVUS tomographic image I11 or rectangular image I01 to the plaque detection model 32M (step S121).
  • the processing unit 30 determines whether or not lipid plaque is present based on the probability information output from the plaque detection model 32M (step S122).
  • the processing unit 30 may determine that lipid plaque is present when lipid plaque is present continuously for a length greater than or equal to a threshold in the longitudinal direction, including the previous and following tomographic images I11 or rectangular images I01.
  • the processing unit 30 may determine the range on the tomographic image I11 or the angle range.
  • the processing unit 30 stores the presence or absence of lipid plaque (or the identified range) determined in step S122 in association with the position on the long axis of the blood vessel (step S123), and proceeds to step S112.
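The longitudinal continuity check in step S122, where a per-frame detection counts only if it persists over a threshold length including the preceding and following frames, can be sketched as follows. The frame pitch and the 2 mm length threshold are assumed example values.

```python
def confirmed_lipid(frame_flags, pitch_mm=0.5, min_len_mm=2.0):
    """frame_flags: per-frame lipid-plaque detections ordered along the long
    axis. A detection is kept only where it persists over at least min_len_mm
    of consecutive frames; returns a same-length list of confirmed flags."""
    need = int(round(min_len_mm / pitch_mm))
    out = [False] * len(frame_flags)
    run_start = None
    # append a sentinel False so the final run is always closed out
    for i, flag in enumerate(list(frame_flags) + [False]):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start >= need:
                for j in range(run_start, i):
                    out[j] = True
            run_start = None
    return out
```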
  • the processing unit 30 identifies the location and range of the lesion on the long axis in step S114 (S114), and then identifies the location of lipid plaque on the long axis of the blood vessel based on the information stored in step S123 (step S124).
  • in step S117, the processing unit 30 displays a graphic showing the location and range of the lesion (lipid plaque, fibrous plaque, calcified plaque, etc.) identified in step S114 and the location of the lipid plaque identified in step S124 (S117), and ends the process.
  • because IVUS can observe deeper parts of the vessel wall (the adventitia side), the presence or absence of lipid plaque is easier to determine with IVUS than with OCT, but identifying its area is difficult because lipid plaque is soft. For this reason, in addition to identifying areas using both IVUS and OCT with the segmentation model 31M, the image processing device 3 narrows its focus to lipid plaque and determines its presence or absence at each position using the plaque detection model 32M, which applies a neural network to the image. This improves the recognition accuracy for areas of lipid plaque, which are recommended to be avoided as stent placement locations. The area of lipid plaque can be visually confirmed with high accuracy, and a reference area avoiding the location of the lipid plaque can also be determined.
  • the processing unit 30 of the image processing device 3 may use not only the plaque detection model 32M for detecting lipid plaques described above, but also a learning model for detecting whether or not a side branch is shown in an IVUS tomographic image I11 or an OCT tomographic image I21 when the image is input.
  • the processing unit 30 uses the learning model to determine whether or not a side branch is shown in the tomographic images I11 and I21 corresponding to the positions on the long axis of the blood vessel, and stores the location where the side branch is shown.
  • the processing unit 30 may calculate the size of the side branch. The position on the long axis where the side branch is present and its size are displayed on the graph 404 on the screen 400 as shown in FIG. 9 or in the vicinity of the graph 404.
  • FIG. 14 shows an example of a screen 400 displayed on the display device 4.
  • components of the screen 400 shown in FIG. 14 are given the same reference numerals and detailed descriptions are omitted.
  • a black diamond mark 410 indicating the location of the side branch is displayed. Furthermore, a numerical value indicating the size (diameter) of the side branch is displayed near the mark 410. This allows the examination operator or medical provider visually viewing the screen 400 to determine the location for placing a stent while recognizing the presence or absence of a side branch and its size.
  • the image processing device 3 uses a learning model that outputs appropriate stent information when a tomographic image I11, I12, or I3 obtained by a single imaging session using the catheter 1 is input.
  • the configuration of the imaging diagnostic device 100 of the third embodiment is the same as that of the imaging diagnostic device 100 of the first embodiment, except for the stent information model 33M stored in the image processing device 3 and the details of the processing by the processing unit 30, which are described below. Therefore, among the configurations of the imaging diagnostic device 100 of the third embodiment, the configurations common to the imaging diagnostic device 100 of the first embodiment are given the same reference numerals and detailed descriptions are omitted.
  • FIG. 15 is a block diagram showing the configuration of an image processing device 3 of the third embodiment.
  • a stent information model 33M is stored in the storage unit 31 of the image processing device 3.
  • the stent information model 33M may be a copy of a stent information model 93M stored in a non-transitory storage medium 9 outside the device, read out via the input/output I/F 32.
  • the stent information model 33M may be a model distributed by a remote server device, acquired by the image processing device 3 via a communication unit (not shown), and stored in the storage unit 31.
  • FIG. 16 is a schematic diagram of the stent information model 33M.
  • the stent information model 33M is a learning model using a neural network equipped with an input layer 331, an intermediate layer 332, and an output layer 333.
  • the input layer 331 inputs image data that visualizes a graph showing the distribution of plaque burden in the longitudinal direction, and image data that visualizes a graph showing the distribution of average lumen diameter in the longitudinal direction.
  • a graphic 406 indicating the presence of lipid plaque and a graphic 407 indicating the presence of fibrous plaque or calcified plaque may be superimposed on the graph.
  • the output layer 333 outputs an array of eligibility values for multiple types of stents, each value corresponding to the identification data of one of the multiple types of stents.
  • the input layer 331 may input not only image data, but also a group of values indicating the distribution of plaque burden in the longitudinal direction (a group of values of plaque burden for positions on the longitudinal axis) and a group of values indicating the distribution of average lumen diameter in the longitudinal direction (a group of values of average lumen diameter for positions on the longitudinal axis).
  • the stent information model 33M will be described below as being created in advance by the image processing device 3, but it may also be created in advance by another processing device and considered to have been trained.
  • FIG. 17 is a flowchart showing an example of the process for generating a stent information model 33M.
  • the processing unit 30 reads out the distribution of plaque burden in the longitudinal direction and the distribution of average lumen diameter in the longitudinal direction that were stored in a single diagnosis of a blood vessel using the catheter 1 in the past (step S301).
  • the processing unit 30 reads out the positions and ranges on the longitudinal axis of lipid plaque, fibrous plaque, or calcified plaque that were stored in the single diagnosis using the catheter 1 (step S302).
  • the processing unit 30 creates an image in which a graphic showing the location of the lipid plaque, fibrous plaque, or calcified plaque identified in step S302 is superimposed on the distribution obtained in step S301 (step S303).
  • the processing unit 30 identifies the identification data of the stent used based on the diagnosis using the catheter 1 from the procedure record (step S304).
  • the processing unit 30 stores, as training data, a set of an image of the graph showing the distribution of plaque burden in the longitudinal direction, an image of the graph showing the distribution of average lumen diameter in the longitudinal direction, and the identification data of the stent identified in step S304 (step S305).
  • if a group of values indicating the distribution of plaque burden in the longitudinal direction and a group of values indicating the distribution of average lumen diameter in the longitudinal direction are input to the stent information model 33M instead of image data, step S303 is omitted. In this case, in step S305, the processing unit 30 stores pairs of each group of values and the stent identification data as training data.
  • the processing unit 30 inputs an image of the graph of the stored training data to the input layer 331 of the stent information model 33M before learning is completed (step S306).
  • the processing unit 30 calculates a loss between the eligibility values for each stent's identification data output from the output layer 333 of the stent information model 33M and the eligibility of the stent actually used for the input image, and thereby learns (updates) the parameters of the intermediate layer 332 (step S307).
  • the processing unit 30 determines whether the learning conditions are met (step S308), and if it is determined that the learning conditions are not met (S308: NO), the processing unit 30 returns to step S306 and continues learning.
  • the processing unit 30 stores the descriptive data indicating the network configuration and conditions of the stent information model 33M and the parameters of the intermediate layer 332 in the storage unit 31 or another storage medium (step S309), and ends the model generation process. Note that the processing unit 30 may execute the processes of steps S301-S305 in advance, and then execute the processes of steps S306-S309 for the collected teacher data.
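The training loop of steps S306 and S307 can be illustrated with a deliberately simplified stand-in: a single linear layer with a sigmoid output trained by gradient descent on binary cross-entropy, in place of the multi-layer network of the publication. The feature encoding (two summary values per case) and the toy samples are assumptions for illustration only.

```python
import math

def train(samples, n_stents, lr=0.5, epochs=300):
    """samples: (features, used) pairs, where `used` is a 0/1 eligibility
    array over stent identification data (1 = the stent actually used).
    Returns per-stent weight vectors of a one-layer stand-in model."""
    dim = len(samples[0][0])
    W = [[0.0] * (dim + 1) for _ in range(n_stents)]    # +1 for bias term
    for _ in range(epochs):
        for x, y in samples:
            xb = x + [1.0]
            for s in range(n_stents):
                z = sum(w * v for w, v in zip(W[s], xb))
                p = 1.0 / (1.0 + math.exp(-z))          # sigmoid output
                g = p - y[s]                            # d(BCE loss)/dz
                for d in range(dim + 1):
                    W[s][d] -= lr * g * xb[d]           # gradient step
    return W

def eligibility(W, x):
    """Eligibility values for one case, one per stent identification data."""
    xb = x + [1.0]
    return [1.0 / (1.0 + math.exp(-sum(w * v for w, v in zip(ws, xb))))
            for ws in W]

# Toy data: [mean plaque burden / 100, mean lumen diameter / 5] per case.
samples = [([0.8, 0.4], [1, 0]),    # severe case -> stent 0 was used
           ([0.3, 0.9], [0, 1])]    # mild case   -> stent 1 was used
W = train(samples, n_stents=2)
```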
  • the stent information model 33M is generated to output appropriate stent information when a data group showing anatomical features obtained by scanning with the diagnostic catheter 1 (or image data that visualizes the data group) is input.
  • the processing unit 30 can input image data of an image of a graph of the average lumen diameter and image data of an image of a graph of the plaque burden into the stent information model 33M, and obtain the output array of eligibility.
  • the processing unit 30 can create suggested information for a stent to be placed based on the identification data of stents whose eligibility is equal to or greater than a predetermined value.
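The final thresholding step, turning the output eligibility array into suggested stent information, can be sketched as follows. The identification-data-to-part-number mapping and the 0.5 threshold are hypothetical example values.

```python
# Hypothetical mapping from stent identification data to part-number info.
STENTS = {0: "ST-18 (3.0 x 18 mm)", 1: "ST-23 (3.5 x 23 mm)"}

def suggested_stents(scores, thresh=0.5):
    """Keep stents whose eligibility is at or above thresh,
    highest score first."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    return [STENTS[i] for i in order if scores[i] >= thresh]

print(suggested_stents([0.12, 0.87]))  # ['ST-23 (3.5 x 23 mm)']
```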
  • FIGS. 18 and 19 are flowcharts showing an example of an information processing procedure by the image processing device 3 of the third embodiment. Among the processing procedures shown in Figs. 18 and 19, the same step numbers are used for the steps common to the processing procedures shown in the flowcharts of Figs. 5 and 6 of the first embodiment, and detailed descriptions thereof will be omitted.
  • when the processing unit 30 determines that scanning of the blood vessel is complete (S112: YES) and has output a graph showing anatomical features and a graphic showing the position, range, and reference area of the lesion (S117), it executes the following process.
  • the processing unit 30 creates an image in which a graphic showing the location of lipid plaque, fibrous plaque, or calcified plaque output in step S117 is superimposed on a graph showing anatomical features (distribution of plaque burden on the long axis and distribution of average lumen diameter on the long axis) (step S131).
  • the processing unit 30 inputs the created image into the stent information model 33M (step S132).
  • the processing unit 30 extracts the identification data of stents whose eligibility is equal to or greater than a predetermined value, based on the array of eligibility values output from the stent information model 33M (step S133).
  • the processing unit 30 outputs information such as the product number and size of the stent identified by the identification data extracted in step S133 to the screen displayed on the display device 4 (step S134), and ends the process.
  • FIG. 20 shows an example of a screen 400 displayed on the display device 4.
  • the components common to the screen 400 shown in FIG. 7 of the first embodiment are given the same reference numerals and detailed description is omitted.
  • the screen 400 in FIG. 20 displays a text box 411 that contains information about the recommended stent. This allows the examination operator or medical provider viewing the screen 400 to refer to it when selecting a stent to place.
  • the image processing device 3 has been described as outputting information suggesting a stent, but this is not limited thereto.
  • the image processing device 3 may also learn a model using a neural network to output information suggesting an appropriate balloon, and output new suggestions using the learned model.
  • Image diagnostic device, 3 Image processing device, 30 Processing unit, 31 Storage unit, 31M Segmentation model, 33M Stent information model, 4 Display device, 5 Input device


Abstract

The invention relates to a computer program, an information processing method, an information processing device, and a learning model that make it possible to display appropriate information required for a determination with respect to a medical image. The computer program causes a computer to execute processes of: calculating data indicating anatomical features of a luminal organ on the basis of signals output from an imaging device provided in a catheter inserted into the luminal organ; identifying a range of a lesion in the luminal organ on the basis of the calculated data indicating the anatomical features; and outputting information for placing a stent in the luminal organ on the basis of the identified range of the lesion.
PCT/JP2023/035280 2022-09-29 2023-09-27 Programme informatique, procédé de traitement d'informations, dispositif de traitement d'informations et modèle d'apprentissage WO2024071251A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-156647 2022-09-29
JP2022156647 2022-09-29

Publications (1)

Publication Number Publication Date
WO2024071251A1 true WO2024071251A1 (fr) 2024-04-04

Family

ID=90477969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/035280 WO2024071251A1 (fr) 2022-09-29 2023-09-27 Programme informatique, procédé de traitement d'informations, dispositif de traitement d'informations et modèle d'apprentissage

Country Status (1)

Country Link
WO (1) WO2024071251A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008512171A (ja) * 2004-09-09 2008-04-24 メディガイド リミテッド 内腔内の選択された位置へ医療用デバイスを移送するための方法およびシステム
JP2012200532A (ja) * 2011-03-28 2012-10-22 Terumo Corp 画像診断装置及び表示方法
JP2013056113A (ja) * 2011-09-09 2013-03-28 Toshiba Corp 画像表示装置
JP2017534394A (ja) * 2014-11-14 2017-11-24 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 経皮的冠動脈インターベンション計画インタフェース、並びに関連するデバイス、システム、及び方法
JP2019217263A (ja) * 2018-05-03 2019-12-26 キヤノン ユーエスエイ, インコーポレイテッドCanon U.S.A., Inc マルチプルイメージングモダリティにわたって関心領域を強調するためのデバイス、システム、および方法
JP2020503909A (ja) * 2016-09-28 2020-02-06 ライトラボ・イメージング・インコーポレーテッド ステント計画システム及び血管表現を使用する方法
JP2021517034A (ja) * 2018-03-15 2021-07-15 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 管腔内病巣評価及び処置計画のための解剖学的標識の決定及び可視化
WO2022054805A1 (fr) * 2020-09-14 2022-03-17 テルモ株式会社 Dispositif de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations et programme informatique
WO2022071181A1 (fr) * 2020-09-29 2022-04-07 テルモ株式会社 Dispositif de traitement d'information, procédé de traitement d'information, programme, et procédé de génération de modèle
WO2022071121A1 (fr) * 2020-09-29 2022-04-07 テルモ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2022079550A (ja) * 2020-06-29 2022-05-26 ライトラボ・イメージング・インコーポレーテッド プロセッサ装置の作動方法



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23872473

Country of ref document: EP

Kind code of ref document: A1