WO2023054467A1 - Model generation method, learning model, computer program, information processing method, and information processing device - Google Patents

Model generation method, learning model, computer program, information processing method, and information processing device

Info

Publication number: WO2023054467A1
Authority: WIPO (PCT)
Prior art keywords: image, model, hollow organ, lesion, catheter
Application number: PCT/JP2022/036157
Other languages: English (en), Japanese (ja)
Inventor: 耕太郎 楠, 雄紀 坂口
Original Assignee: Terumo Corporation (テルモ株式会社)
Application filed by Terumo Corporation (テルモ株式会社)
Publication of WO2023054467A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/313: for introducing through surgical openings, e.g. laparoscopes
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12: in body cavities or body tracts, e.g. by using catheters

Definitions

  • The present invention relates to a model generation method for a model used to capture anatomical features of a hollow organ from a medical image of the hollow organ, a trained learning model, a computer program using the learning model, an information processing method, and an information processing device.
  • PCI: percutaneous coronary intervention.
  • Angiography: imaging of blood vessels from outside the body using a contrast agent.
  • OCT: Optical Coherence Tomography.
  • OFDI: Optical Frequency Domain Imaging.
  • An object of the present disclosure is to provide a model generation method, a trained learning model, a computer program using the learning model, an information processing method, and an information processing device for outputting angle data that captures anatomical features of a hollow organ based on an image obtained using a catheter.
  • In a model generation method according to the present disclosure, a computer acquires scanning signals from an imaging device provided in a catheter that is inserted into a hollow organ and moves in the length direction of the hollow organ while rotating about its longitudinal axis, and generates a learning model that, when an image of the hollow organ based on the scanning signals is input, outputs the existence probability of a lesion or a medical instrument for each rotation angle of the catheter.
  • A learning model according to the present disclosure comprises an input layer to which an image based on scanning signals from an imaging device provided in such a catheter is input, an output layer that outputs the existence probability of a lesion or a medical instrument for each rotation angle of the catheter, and an intermediate layer trained on teacher data comprising the image and data indicating the presence or absence of a lesion or a medical instrument for each angle in the image. The learning model operates a computer to provide an image based on a scanning signal from the imaging device to the input layer, perform operations based on the intermediate layer, and output the probabilities from the output layer.
  • A computer program according to the present disclosure causes a computer that acquires scanning signals from an imaging device provided in such a catheter to use a first model that, when an image of the hollow organ based on the scanning signal is input, outputs data in which different regions including a lumen and a membrane of the hollow organ are identified in the image, and a second model that, when the image is input, outputs the existence probability of a lesion or a medical instrument for each rotation angle of the catheter, and to calculate, from these outputs, an angular range that captures the lesion or the medical instrument from the center of the hollow organ.
  • In an information processing method according to the present disclosure, a computer that acquires scanning signals from an imaging device provided in such a catheter uses the first model and the second model described above, and calculates, based on parameters indicating the anatomical features of the hollow organ derived from the region identified by the first model and on the existence probability for each rotation angle obtained from the second model, an angular range that captures the lesion or the medical instrument from the center of the hollow organ.
  • An information processing apparatus according to the present disclosure acquires scanning signals from an imaging device provided in such a catheter. The apparatus includes a storage unit that stores a first model for outputting, when an image of the hollow organ based on the scanning signal is input, data in which different regions including a lumen and a membrane of the hollow organ are identified in the image, and a second model that outputs the existence probability of a lesion or a medical instrument for each rotation angle of the catheter when the image is input; and a processing unit that calculates, based on parameters representing anatomical features of the hollow organ derived from the region identified by the first model and on the existence probability for each rotation angle obtained from the second model, the angular range that captures the lesion or the medical instrument from the center of the hollow organ.
  • FIG. 1 is a diagram showing a configuration example of a diagnostic imaging apparatus. FIG. 2 is an explanatory diagram showing the operation of the catheter. FIG. 3 is a block diagram showing the configuration of the image processing device. FIG. 4 is a schematic diagram of the trained first model. FIG. 5 is a diagram showing detected boundaries (contours). FIG. 6 is a schematic diagram of the second model. FIG. 7 is a diagram showing an outline of learning of the second model. FIG. 8 is a flowchart showing an example of the procedure for generating the second model. FIG. 9 is a diagram showing the deviation between the center of the catheter and the center of the blood vessel (the center of gravity of the cross section). FIGS. 10 and 11 are flowcharts showing an example of the processing procedure by the processing unit of the image processing device. FIG. 12 is a flowchart showing an example of the angle range calculation procedure. FIG. 13 is an explanatory diagram of the angle range calculation process. FIG. 14 is an explanatory diagram of the lumen boundary correction process. FIG. 15 shows an example of a screen displayed on the display device. FIG. 16 shows another screen example on the display device. FIG. 17 shows an example of calculating an angle that captures a lesion occurring only on the surface of the lumen. FIG. 18 shows an example of calculating an angle that captures calcified plaque. FIG. 19 shows an example of calculating an angle that captures attenuating plaque. FIG. 20 shows an example of calculating an angle that captures a stent placed in a blood vessel. FIG. 21 shows an example of calculating an angle that captures a guide wire inserted with the catheter.
  • FIG. 1 is a diagram showing a configuration example of the diagnostic imaging apparatus 100.
  • The diagnostic imaging apparatus 100 is an apparatus that generates medical images, including ultrasonic tomographic images of blood vessels (hollow organs), by the IVUS method, and is used for intravascular ultrasonic examination and diagnosis.
  • the diagnostic imaging apparatus 100 includes a catheter 1 , an MDU (Motor Drive Unit) 2 , an image processing device (information processing device) 3 , a display device 4 and an input device 5 .
  • the catheter 1 is a flexible tube for medical use.
  • The catheter 1 is an imaging catheter, which has an imaging device 11 at its distal end and is rotated in the circumferential direction by driving from its proximal end.
  • For the IVUS method, the imaging device 11 is an ultrasound probe including an ultrasound transducer and an ultrasound sensor.
  • For OCT/OFDI, it is an OCT device including a near-infrared laser, a near-infrared sensor, and the like.
  • Other devices that use electromagnetic waves of other wavelengths, such as visible light, may also be used as the imaging device 11.
  • the MDU 2 is a driving device attached to the proximal end of the catheter 1, and controls the operation of the catheter 1 by driving the internal motor according to the operation of the medical staff.
  • the image processing device 3 generates a plurality of medical images such as tomographic images of blood vessels based on the signals output from the imaging device 11 of the catheter 1 .
  • the details of the configuration of the image processing device 3 will be described later.
  • the display device 4 uses a liquid crystal display panel, an organic EL display panel, or the like.
  • the display device 4 displays medical images generated by the image processing device 3 and information about the medical images.
  • the input device 5 is an input interface that receives operations on the image processing device 3 .
  • the input device 5 may be a keyboard, a mouse, or the like, or may be a touch panel, soft keys, hard keys, or the like built into the display device 4 .
  • FIG. 2 is an explanatory diagram showing the operation of the catheter 1.
  • The catheter 1 is inserted by a medical practitioner into a tubular blood vessel L, here a coronary artery, along a previously inserted guide wire W, as shown in the figure.
  • In the figure, the right side corresponds to the distal side relative to the insertion point of the catheter 1 and the guide wire W, and the left side corresponds to the proximal side.
  • By driving the MDU 2, the catheter 1 rotates about its longitudinal axis while moving from distal to proximal within the blood vessel L, as indicated by the arrow in the figure. The imaging device 11 therefore scans the inside of the blood vessel L spirally.
  • the image processing apparatus 3 acquires the signal for each scan output from the imaging device 11 of the catheter 1 .
  • the imaging device 11 radially emits detection waves and detects the reflected waves.
  • The imaging device 11 performs this scanning once for each angle, tens to thousands of times per 360-degree rotation.
  • the image processing device 3 can acquire the distribution of the detected reflected waves in the radial direction for each angle.
  • For every 360 degrees of rotation, the image processing device 3 converts the rectangular image (I0 in FIG. 2), in which the signals for each scan are aligned in the radial direction and arranged side by side, into a tomographic image (cross-sectional image) by polar coordinate transformation (inverse transformation) (I1 in FIG. 2).
  • the tomographic image I1 is also called a frame image.
  • the reference point (center) of the tomographic image I1 corresponds to the area of the catheter 1 (not imaged).
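  • As a concrete illustration, the inverse polar transform can be sketched as follows. This is a minimal sketch, assuming the rectangular image I0 stores one scan line per row (angle along the vertical axis) and using OpenCV's warpPolar with the inverse-map flag; the output size is arbitrary.

```python
import cv2
import numpy as np

def rect_to_tomographic(rect_img: np.ndarray, size: int = 512) -> np.ndarray:
    """Map a rectangular scan image I0 (rows = angles over 360 degrees,
    columns = radial depth) to a Cartesian cross-sectional image I1."""
    center = (size / 2.0, size / 2.0)
    max_radius = size / 2.0
    return cv2.warpPolar(
        rect_img,
        (size, size),
        center,
        max_radius,
        cv2.WARP_INVERSE_MAP | cv2.WARP_POLAR_LINEAR,
    )
```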
  • the image processing device 3 can output information including an image showing the structure of the blood vessel based on the obtained tomographic image I1 so that the medical staff can visually recognize the information.
  • In particular, the image processing device 3 can output the range (angle range) in which a lesion existing in the blood vessel, or a medical instrument such as a stent, is captured when viewed from a reference point such as the center of the hollow organ. The angle range and its output method are described in detail below.
  • FIG. 3 is a block diagram showing the configuration of the image processing device 3.
  • the image processing device 3 is a computer and includes a processing section 30 , a storage section 31 and an input/output I/F 32 .
  • The processing unit 30 includes one or more of a CPU (Central Processing Unit), an MPU (Micro-Processing Unit), a GPU (Graphics Processing Unit), a GPGPU (General-Purpose computing on Graphics Processing Units), a TPU (Tensor Processing Unit), and the like.
  • The processing unit 30 incorporates built-in memory such as RAM (Random Access Memory), stores data generated during processing in that memory, and performs operations by executing the computer program 3P stored in the storage unit 31.
  • the storage unit 31 is a nonvolatile storage medium such as a hard disk or flash memory.
  • the storage unit 31 stores the computer program 3P read by the processing unit 30, setting data, and the like.
  • The storage unit 31 also stores the trained first model 31M and second model 32M.
  • The computer program 3P, the first model 31M, and the second model 32M may be copies of a computer program 9P, a first model 91M, and a second model 92M stored in a non-transitory recording medium 9 outside the device, read in via the input/output I/F 32.
  • Alternatively, the computer program 3P and the trained first model 31M and second model 32M may be obtained by the image processing device 3 from a remote server device via a communication unit (not shown) and stored in the storage unit 31.
  • the input/output I/F 32 is an interface to which the catheter 1, the display device 4 and the input device 5 are connected.
  • the processing unit 30 acquires signals (digital data) output from the imaging device 11 via the input/output I/F 32 .
  • the processing unit 30 outputs screen data of a screen including the generated tomographic image I1 and/or the longitudinal image I2 to the display device 4 via the input/output I/F 32 .
  • the processing unit 30 receives operation information input to the input device 5 via the input/output I/F 32 .
  • FIG. 4 is a schematic diagram of the learned first model 31M.
  • The first model 31M is a model trained to output, when the tomographic image I1 obtained by polar coordinate transformation of the scanning signals is input, an image showing the regions of one or more objects appearing in the tomographic image I1.
  • the first model 31M is, for example, a model that implements semantic segmentation.
  • The first model 31M is trained to output a tag image IS in which each pixel of the input tomographic image I1 is tagged to indicate which object range it belongs to.
  • the first model 31M uses a so-called U-net in which a convolutional layer, a pooling layer, an upsampling layer, and a softmax layer are arranged symmetrically.
  • the first model 31M is a model that outputs a tag image IS indicating the range identified in the image when the tomographic image I1 is input.
  • In the output tag image IS, the range of the lumen of the blood vessel, the range of the membrane including the media (lying between the lumen boundary and the vessel boundary), the range of the guide wire W and its reflection, the area corresponding to the catheter 1, and lesions (e.g., plaque, calcification) are tagged with different pixel values at the corresponding pixels (indicated by different hatching and solid fills in FIG. 4).
  • The lesion areas may include areas of artificial objects, in particular medical instruments such as stents.
  • Although semantic segmentation and U-net are given above as examples for the first model 31M, the model is of course not limited to these.
  • the first model 31M may be a model that implements individual recognition processing by instance segmentation or the like.
  • the first model 31M is not limited to the U-net base, and may use a model based on SegNet, R-CNN, or an integrated model with other edge extraction processing.
  • Using the pixel values in the tag image IS obtained by inputting the tomographic image I1 to the first model 31M, together with the coordinates of the pixels in the image, the processing unit 30 can edge-detect the lumen boundary and the vessel boundary of the blood vessel examined with the catheter 1.
  • the blood vessel boundary is the external elastic membrane (EEM) between the media and the adventitia of the blood vessel, and it appears relatively clearly with low brightness in the tomographic image I1 by the IVUS method.
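  • A minimal sketch of this edge detection, assuming the tag image IS encodes each region with a distinct integer label (the LUMEN and MEMBRANE values below are hypothetical):

```python
import cv2
import numpy as np

LUMEN, MEMBRANE = 1, 2  # hypothetical tag values in the tag image IS

def detect_boundaries(tag_img: np.ndarray):
    """Return the lumen boundary B1 (contour of the lumen region) and the
    vessel boundary B2 (outer contour of lumen + membrane) as point arrays."""
    lumen_mask = (tag_img == LUMEN).astype(np.uint8)
    vessel_mask = np.isin(tag_img, (LUMEN, MEMBRANE)).astype(np.uint8)
    lumen_cnts, _ = cv2.findContours(lumen_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    vessel_cnts, _ = cv2.findContours(vessel_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    b1 = max(lumen_cnts, key=cv2.contourArea)   # curve B1
    b2 = max(vessel_cnts, key=cv2.contourArea)  # curve B2
    return b1, b2
```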
  • FIG. 5 is a diagram showing detected boundaries (contours).
  • FIG. 5 shows a state in which a curve B1 indicating the lumen boundary and a curve B2 indicating the vessel boundary, obtained based on the tag image IS output from the first model 31M, are superimposed on the tomographic image I1 shown in FIG. 4.
  • the image processing device 3 in the present embodiment stores and uses the second model 32M used together with the first model 31M in the storage unit 31.
  • the second model 32M is a model that outputs the presence or absence of a lesion or a medical instrument for each scanning signal in association with angle information.
  • FIG. 6 is a schematic diagram of the second model 32M.
  • the second model 32M is a model using a neural network comprising an input layer 321, an intermediate layer 322 and an output layer 323.
  • the input layer 321 inputs two-dimensional signal distribution, that is, image data.
  • the output layer 323 outputs the probability that a lesion or medical device exists for each angle data (eg, 1° to 360°).
  • For example, the output layer 323 outputs an array of 180 probabilities, one every 2°: 0°, 2°, 4°, ..., 356°, 358°.
  • The number is not limited to 180; it may be, for example, 360 at 1° intervals, 120 at 3° intervals, 90 at 4° intervals, or more.
  • the processing unit 30 can input the tomographic image I1 to the input layer 321, or input the rectangular image I0 to the input layer 321, and obtain an array of output probabilities.
  • The processing unit 30 obtains the array of existence probabilities of a lesion or a medical instrument for each angle output from the second model 32M, and can obtain the portions where the probability is equal to or greater than a predetermined value as the angular range of the lesion or medical instrument. Note that the existence probability for each angle output from the second model 32M is referenced to the imaging device 11.
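  • The internal architecture is described only as input, intermediate, and output layers; the following PyTorch sketch is one plausible realization (all layer sizes are assumptions) of a network mapping an input image to 180 per-angle probabilities:

```python
import torch
import torch.nn as nn

class SecondModel(nn.Module):
    """Sketch: image in, array of 180 presence probabilities out (2-degree bins)."""
    def __init__(self, n_angles: int = 180):
        super().__init__()
        self.intermediate = nn.Sequential(      # intermediate layer 322
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8), nn.Flatten(),
        )
        self.output = nn.Sequential(            # output layer 323
            nn.Linear(32 * 8 * 8, n_angles), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.output(self.intermediate(x))

probs = SecondModel()(torch.randn(1, 1, 512, 512))  # shape (1, 180)
```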
  • the second model 32M is created in advance by the image processing device 3 or another processing device and is already trained.
  • FIG. 7 is a diagram showing an outline of learning of the second model 32M. In the following description, it is assumed that the image processing device 3 executes the learning process in advance, but this is only an example; another processing device may perform the learning in advance.
  • the teacher data is the annotated tomographic image I1 or rectangular image I0.
  • the annotation is the existence probability for each angle with respect to the tomographic image I1 or rectangular image I0 in which the presence or absence of a lesion or a medical instrument and the location in the image are known.
  • the teacher data may be prepared for each type of lesion or each type of medical device.
  • the second model 32M may be learned separately with different teacher data depending on whether it is calcified plaque, attenuating plaque, stent, or guidewire.
  • In FIG. 7, the presence or absence of a lesion or a medical instrument for each angle in the teacher data is indicated by marks placed at equal distances around the tomographic image I1; presence is indicated by hatched marks.
  • the annotation is, for example, an array of probabilities associated with the tomographic image I1, and is assigned "1.0" if a lesion or medical instrument is present and "0.0" if not.
  • In the example of FIG. 7, plaque exists in the range of 134° to 230°, so the existence probability in that angle range is "1.0".
  • The processing unit 30 of the image processing device 3 trains and generates the second model 32M based on teacher data of images annotated with such known existence probabilities.
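  • A sketch of building such an annotation array from a known angular extent (wrap-around ranges are omitted for brevity):

```python
import numpy as np

def make_label(start_deg: float, end_deg: float, step: float = 2.0) -> np.ndarray:
    """Per-angle label array: 1.0 where the lesion or instrument is present."""
    angles = np.arange(0.0, 360.0, step)
    return ((angles >= start_deg) & (angles <= end_deg)).astype(np.float32)

label = make_label(134, 230)  # plaque present from 134 to 230 degrees, as in FIG. 7
```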
  • FIG. 8 is a flow chart showing an example of the process of generating the second model 32M.
  • the processing unit 30 acquires the tomographic image I1 or the rectangular image I0 generated in the past based on the signal obtained from the imaging device 11 (step S201).
  • the processing unit 30 receives the presence or absence of a lesion or a medical instrument for each angle with reference to the image center (the center of the catheter 1) in the acquired image (step S202).
  • As a reception method, the processing unit 30 may display the tomographic image I1 and the marks on the display device 4 as shown in FIG. 7, and control each mark so that it changes to a color, pattern, or the like indicating "present (1.0)" or "absent (0.0)".
  • Alternatively, the reception method may accept an array of "present (1.0)" / "absent (0.0)" values for each angle.
  • the processing unit 30 stores teacher data including the image acquired in step S201 and data indicating the presence/absence of each angle received in step S202 for the image (step S203). It is desirable that the processing of steps S201-S203 be performed until as many data as possible are collected.
  • the tomographic image I1 or rectangular image I0 that serves as teacher data may be reduced before being used for learning.
  • When a rectangular image I0 is input, it is desirable to learn using a 540° rectangular image I0 obtained by adding 90° before and after the 360° data (see FIG. 6). Inputting an image that maintains continuity with the adjacent scanning signals preserves the accuracy of learning; a sketch of this padding follows.
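  • A minimal sketch of the 540° padding, assuming one scan line per row of I0:

```python
import numpy as np

def pad_to_540(rect_img: np.ndarray) -> np.ndarray:
    """Wrap 90 degrees from each end of a 360-degree rectangular image so the
    network sees angularly continuous context (360 + 2 * 90 = 540 degrees)."""
    pad = rect_img.shape[0] // 4  # 90 degrees' worth of scan lines
    return np.concatenate([rect_img[-pad:], rect_img, rect_img[:pad]], axis=0)
```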
  • The processing unit 30 inputs a stored teacher-data image to the input layer 321 of the not-yet-trained second model 32M (step S204).
  • The processing unit 30 calculates a loss between the existence probability for each angle output from the output layer 323 of the second model 32M and the presence/absence data for each angle corresponding to the input image, and learns (updates) the parameters of the intermediate layer 322 (step S205).
  • the processing unit 30 determines whether or not the learning conditions are satisfied (step S206), and if it is determined that the learning conditions are not satisfied (S206: NO), the process returns to step S201 to continue learning.
  • If it is determined that the learning conditions are satisfied, the processing unit 30 stores description data indicating the network configuration and conditions of the second model 32M, together with the parameters of the intermediate layer 322, in the storage unit 31 or another storage medium (step S207), and ends the model generation process. Note that the processing unit 30 may receive the processing of steps S201 to S203 in advance and then execute the processing of steps S204 to S207 on the collected teacher data.
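  • Steps S204 to S207 correspond to an ordinary supervised training loop; the following is a sketch with a per-angle binary cross-entropy loss (the dataset and hyperparameters are assumptions):

```python
import torch
from torch.utils.data import DataLoader

def train(model, dataset, epochs: int = 20, lr: float = 1e-3) -> None:
    """Feed teacher images, compare the per-angle output probabilities with
    the known labels, and update the intermediate-layer parameters."""
    loader = DataLoader(dataset, batch_size=16, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.BCELoss()     # presence at each angle is a binary target
    for _ in range(epochs):          # a fixed epoch count stands in for the
        for image, label in loader:  # "learning conditions" of step S206
            optimizer.zero_grad()
            loss_fn(model(image), label).backward()
            optimizer.step()
    torch.save(model.state_dict(), "second_model.pt")  # step S207
```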
  • In this way, the second model 32M is generated so as to output a circumferential probability distribution indicating the presence or absence of a lesion or a medical instrument for each angle relative to the imaging device 11.
  • The point to note is that the probabilities for each angle output from the generated second model 32M describe the circumferential angle range that captures the lesion or the like centered on the catheter 1.
  • the circumferential probability distribution output from the second model 32M is based on the center of the tomographic image I1, that is, the center of the catheter 1.
  • FIG. 9 is a diagram showing the deviation between the center of the catheter 1 and the center of the blood vessel (the center of gravity of the cross section).
  • In FIG. 9, the center of the tomographic image I1 is indicated by the symbol x; it corresponds to the long side of the rectangular image I0 on the catheter 1 side.
  • Data that captures the lesion centered on the catheter 1 therefore deviates from the angle information referenced to the center of the blood vessel, which is what should be considered clinically.
  • Therefore, the image processing device 3 in the present embodiment uses the first model 31M to recalculate the catheter-centered angles capturing an object such as a lesion or a medical instrument, derived based on the second model 32M, into angles based on the anatomical features of the blood vessel.
  • FIGS. 10 and 11 are flowcharts showing an example of the processing procedure by the processing unit 30 of the image processing device 3. When a signal is output from the imaging device 11 of the catheter 1, the processing unit 30 of the image processing device 3 starts the following processing.
  • The processing unit 30 acquires the signals output from the imaging device 11 (step S101), and performs polar coordinate transformation (inverse transformation) on the image in which the radial signals are arranged in a rectangular shape, generating a tomographic image I1 (step S102) (see FIG. 2).
  • the processing unit 30 outputs the generated tomographic image I1 so that it can be displayed in real time on the screen displayed on the display device 4 (step S103).
  • the processing unit 30 stores the signal data acquired in step S101 and the tomographic image I1 in the storage unit 31 in association with the position (long axis position, angle) of the imaging device 11 (step S104).
  • The processing unit 30 inputs the tomographic image I1 to the first model 31M (step S105). Based on the tag image IS obtained from the first model 31M, the processing unit 30 calculates data on the lumen boundary and the vessel boundary in the tomographic image I1 (step S106). In step S106, of the lumen range and the membrane range (including the media of the blood vessel) output from the first model 31M, the processing unit 30 calculates the contour (edge) of the lumen range as the lumen boundary and the outer contour of the membrane range as the vessel boundary. In step S106, the processing unit 30 may reduce the size of the tomographic image I1 before inputting it to the first model 31M to speed up processing.
  • the processing unit 30 calculates the center of the blood vessel based on the data of the lumen boundary and blood vessel boundary calculated in step S106 (step S107).
  • the processing unit 30 extracts a circle along the boundary of the blood vessel boundary or the lumen boundary by, for example, Hough transform, and determines the center of the circle as the center of the blood vessel.
  • the processing unit 30 may obtain the center of gravity of the region inside the blood vessel boundary or the region inside the lumen boundary, and determine the center of gravity as the blood vessel center.
  • the processing unit 30 stores the calculated coordinates of the blood vessel center in the tomographic image I1 (step S108).
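  • The Hough-transform variant of step S107 fits a circle to the boundary; the centroid variant can be sketched with image moments as below (the binary-mask encoding is an assumption):

```python
import cv2
import numpy as np

def vessel_center(region_mask: np.ndarray) -> tuple[float, float]:
    """Center of gravity (x, y) of a non-empty binary mask of the region
    inside the vessel boundary (or inside the lumen boundary)."""
    m = cv2.moments(region_mask.astype(np.uint8), binaryImage=True)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```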
  • The processing unit 30 generates, from the tomographic image I1 or the signal data acquired in step S101, a rectangular image I0 (see FIGS. 2 and 9) obtained by adding 90° of scanning results before and after the 360° scanning result, and inputs it to the second model 32M (step S109).
  • the processing unit 30 acquires the existence probability for each angle of the lesion or medical device obtained from the second model 32M (step S110).
  • Based on the acquired existence probabilities, the processing unit 30 calculates, with reference to the center of the blood vessel, the angular range in which the lesion or medical instrument exists (step S111). The processing in step S111 will be described later.
  • the processing unit 30 superimposes an image showing the angular range in which the lesion or the medical instrument exists as a result of the calculation process in step S111 on the tomographic image I1 (step S112).
  • the processing unit 30 stores the data of the angular range in which the lesion or the medical instrument exists in association with the tomographic image I1 stored in step S104 (step S113).
  • the processing unit 30 corrects the range of the lesion or the medical instrument identified based on the tag image IS obtained in step S105, based on the angular range calculated in step S111 (step S114). In step S114, the processing unit 30 may, for example, leave the range included in the angle range and remove other ranges as noise.
  • the processing unit 30 causes the display device 4 to display data indicating the position of the target tomographic image I1 in the longitudinal direction (step S115).
  • In step S115, the processing unit 30 displays the position of the target tomographic image I1 in real time during scanning. In this case, when the target tomographic image I1 has been associated with an angular range in which a lesion or medical instrument exists by the processing in step S111, it is preferable that the processing unit 30 superimpose a mark or the like indicating this on the image displayed in the long-axis direction.
  • the processing unit 30 determines whether or not the scanning of the catheter 1 by the imaging device 11 has been completed (step S116). If it is determined that scanning has not been completed (S116: NO), the processing unit 30 returns the process to step S101 to generate the next tomographic image I1.
  • FIG. 12 is a flowchart showing an example of an angle range calculation processing procedure.
  • the processing procedure shown in the flowchart of FIG. 12 corresponds to the detailed procedure of step S111 in the flowcharts of FIGS.
  • The processing unit 30 calculates, from the existence probability for each angle obtained from the second model 32M, a range in which the existence probability is continuously equal to or higher than a predetermined probability value as a range in which the lesion or the medical instrument exists (step S301); a sketch follows.
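  • A minimal sketch of step S301 (the threshold value is an assumption, and ranges that wrap past 360° are not merged):

```python
import numpy as np

def presence_ranges(probs: np.ndarray, thresh: float = 0.5, step: float = 2.0):
    """Contiguous angle ranges (start_deg, end_deg) where the per-angle
    existence probability stays at or above the threshold."""
    above = probs >= thresh
    ranges, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        if (not flag or i == len(above) - 1) and start is not None:
            end = i if flag else i - 1
            ranges.append((start * step, end * step))
            start = None
    return ranges  # e.g. [(134.0, 230.0)] for the plaque of FIG. 7
```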
  • The processing unit 30 refers to the pixel values in the tomographic image I1 and extracts a portion with higher (or lower) pixel values than the rest between the vessel boundary and the lumen boundary (step S302). That is, in step S302, the processing unit 30 extracts a portion of the image that is brighter, or darker, than its surroundings. In step S302, the processing unit 30 may also extract a region inside the lumen boundary (because some lesions or medical instruments reach the inside of the lumen). The process of step S302 may also be omitted, since extraction may not be possible depending on the type of lesion.
  • Based on the extracted portion, the processing unit 30 adjusts the angular range, seen from the center of the tomographic image I1 (the center of the catheter 1), of the range in which the lesion or the medical instrument exists (step S303).
  • In step S303, the processing unit 30 may exclude portions extracted in step S302 that lie outside the range calculated in step S301. If the range calculated in step S301 and the portion extracted in step S302 overlap, the processing unit 30 may adjust the angle range to the wider range (the OR of the two) or to the narrower range (the AND of the two).
  • the processing of steps S302 and S303 is not essential.
  • Next, for the angular range in which the probability that a lesion or a medical instrument exists is high, the processing unit 30 calculates the intersections, in the tomographic image I1, of the straight line at the maximum angle and the straight line at the minimum angle from the center of the tomographic image I1 with the calculated lumen boundary (or vessel boundary) (step S304).
  • The processing unit 30 calculates and stores the angles of the straight lines connecting the coordinates of the two intersection points calculated in step S304 with the coordinates of the center of the blood vessel (step S305). In step S305, the processing unit 30 calculates the angle between each straight line and the vertically upward line segment.
  • Based on the coordinates of the two points stored in step S305, the processing unit 30 connects the intersections of the straight lines with the lumen boundary or the vessel boundary using a spline curve, arc, or the like, and corrects the lumen boundary or the vessel boundary (step S306).
  • the processing unit 30 ends the calculation processing of the angle range based on the center of the blood vessel, and returns the processing to step S112 in the flowcharts of FIGS. 10 and 11 .
  • The processing unit 30 may execute the processing of step S302 after the processing of step S305 in the processing procedure shown in the flowchart of FIG. 12.
  • the processing unit 30 executes processing as follows.
  • The processing unit 30 calculates the intersections of the two straight lines from the center of the tomographic image I1 to both ends of the angular range with the lumen boundary or the vessel boundary (S304), and calculates the angle of the straight line connecting each intersection with the center of the blood vessel (S305).
  • The processing unit 30 then extracts a portion with higher pixel values (whiter than the surroundings) within the fan shape bounded by the two straight lines at the angles calculated in step S305 and the vessel boundary or the lumen boundary (S302).
  • The processing unit 30 may then change the angles of the two straight lines, referenced to the center of the blood vessel, according to the size and shape of the extracted portion; the conversion of an intersection point into a vessel-center angle is sketched below.
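  • Converting an intersection point into a vessel-center-referenced angle reduces to trigonometry; a sketch using the convention of step S305 (degrees measured clockwise from vertically upward, with the image y axis pointing down):

```python
import math

def angle_from_center(px: float, py: float, cx: float, cy: float) -> float:
    """Angle (degrees, 0-360) of the line from the vessel center (cx, cy)
    to a boundary intersection (px, py), clockwise from straight up."""
    dx = px - cx
    dy = cy - py  # flip sign: image coordinates grow downward
    return math.degrees(math.atan2(dx, dy)) % 360.0
```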
  • FIGS. 13A and 13B are explanatory diagrams of the angular range calculation process.
  • FIG. 13A shows an angular range, from the center of the tomographic image I1 shown in FIG. 9, in which the existence probability is continuously equal to or higher than the predetermined probability value.
  • FIG. 13A also shows the portion (plaque) with higher pixel values than its surroundings, between the vessel boundary curve B2 and the lumen boundary curve B1, extracted in step S302.
  • the processing unit 30 may make adjustments in step S303 so that a straight line touches the extracted range, as indicated by the dashed line.
  • FIG. 13B shows the result of recalculating the angular range from the image center of the tomographic image I1 shown in FIG. 13A as the angular range from the blood vessel center.
  • the angular range from the image center of the tomographic image I1 is indicated by a dashed line.
  • the processing unit 30 can also recalculate the blood vessel boundary and the lumen boundary based on the angle range data with reference to the center of the blood vessel.
  • FIG. 14 is an explanatory diagram of the lumen boundary correction process.
  • FIG. 14A shows the lumen boundary curve B1 before correction
  • FIG. 14B shows the process during correction
  • FIG. 14C shows the lumen boundary curve B1 after correction.
  • a lumen boundary curve B1 is calculated as a region boundary as a result of performing segmentation on the tomographic image I1 (or the rectangular image I0) using the first model 31M.
  • In a portion where a lesion or the like is imaged, the pixel values may be low, making the region difficult to identify, and the accuracy of the calculated boundary may be low.
  • In FIG. 14A, the lumen boundary curve B1 is calculated with the ambiguous portion recessed toward the center, due to the influence of the range in which the lesion is imaged.
  • FIG. 14B shows the intersections of the straight lines indicating the angular range in which the lesion or the medical instrument is likely to exist with the lumen boundary, for the image shown in FIG. 14A.
  • The processing unit 30 removes the lumen boundary between the two intersections.
  • FIG. 14C shows the lumen boundary after correction.
  • The corrected lumen boundary curve B1 shown in FIG. 14C is a spline curve connecting the intersection points shown in FIG. 14B. As a result, a naturally connected boundary is drawn even where the pixel values are low because a lesion or a medical instrument was imaged. Since blood vessels are elastic, their membranes should form smooth curved surfaces with smooth boundaries, so curves connected by splines may reproduce the true boundary better.
  • the correction process whose course is shown in FIG. 14 can be applied not only to the lumen boundary but also to the blood vessel boundary.
  • The correction of the lumen boundary curve B1 shown in FIG. 14C may be performed not only with a spline curve but also with an arc of a circle centered on the identified center of the blood vessel.
  • This is because a blood vessel is elastic and naturally has a substantially circular cross-section; a sketch of the spline variant follows.
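  • A minimal sketch of the spline variant, assuming the boundary is an ordered array of (x, y) points and the gap indices lie a few points away from the array ends:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def bridge_gap(boundary: np.ndarray, i0: int, i1: int) -> np.ndarray:
    """Replace the unreliable boundary points strictly between the two
    intersections (indices i0 < i1) with a spline fitted to a few anchor
    points on either side of the gap."""
    anchors = np.r_[i0 - 3 : i0 + 1, i1 : i1 + 4]  # four points on each side
    spline_x = CubicSpline(anchors, boundary[anchors, 0])
    spline_y = CubicSpline(anchors, boundary[anchors, 1])
    gap = np.arange(i0 + 1, i1)
    fixed = boundary.copy()
    fixed[gap, 0] = spline_x(gap)
    fixed[gap, 1] = spline_y(gap)
    return fixed
```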
  • FIG. 15 shows an example of a screen 400 displayed on the display device 4.
  • a screen 400 shown in FIG. 15 includes a cursor 401 for selecting a position in the longitudinal direction of the blood vessel, and a tomographic image I1 at the position corresponding to the cursor 401.
  • Screen 400 includes a graph 402 of data indicative of anatomical features.
  • Graph 402 shows the distribution of mean lumen diameter and percentage of plaque coverage versus position on the longitudinal axis.
  • the tomographic image I1 displayed on the screen 400 of FIG. 15 shows an image (black circle, straight line, and dashed line) representing the angular range in which the lesion or the medical instrument exists, with reference to the center of the blood vessel.
  • Below the tomographic image I1 on the screen 400, data indicating anatomical features calculated from the tomographic image I1 at that position and numerical values indicating the angular range are displayed.
  • On the screen 400, the medical staff can visually recognize the result of recalculating, with reference to the center of the blood vessel, the angles at which objects such as lesions or medical instruments are captured. As a result, the medical staff can grasp the extent of the lesion when inserting a catheter for treatment.
  • FIG. 16 is a diagram showing another screen example on the display device 4. The screen 400 in FIG. 16 includes a three-dimensional image 403 of a blood vessel, and the three-dimensional image 403 includes a three-dimensional cursor 401, an object movable along the long axis.
  • On the three-dimensional image 403, an image in which the angular ranges capturing the lesions identified for each tomographic image I1 are connected in the longitudinal direction is superimposed. This makes it easier for medical personnel to perceive lesions or medical instruments in blood vessels stereoscopically.
  • FIG. 17 shows an example of calculating an angle that captures a lesion occurring only on the surface layer of the lumen.
  • FIG. 17A shows a lumen boundary curve B1 and a blood vessel boundary curve B2 obtained by inputting a tomographic image I1 showing dissection and protrusion into the first model 31M.
  • FIG. 17B shows an angular range in which a lesion or a medical instrument exists with reference to the center of the image obtained by inputting the tomographic image I1 to the second model 32M.
  • FIG. 17C shows the angular range relative to the vessel center.
  • As shown in FIGS. 17B to 17C, the processing unit 30 calculates the intersections of the two straight lines from the center of the tomographic image I1 to both ends of the angular range with the lumen boundary curve B1 or the vessel boundary curve B2, and obtains straight lines passing through those intersections and the center of the blood vessel to calculate the angular range from the vessel center. As a result, an angle capturing a lesion that occurs only on the luminal surface, such as dissection or protrusion, can be obtained.
  • FIG. 18 shows an example of calculating the angle at which calcified plaque is captured.
  • FIG. 18A shows a lumen boundary curve B1 and a blood vessel boundary curve B2 obtained by inputting a tomographic image I1 showing a calcified plaque into the first model 31M.
  • FIG. 18B shows an angular range in which a lesion or a medical instrument exists with reference to the center of the image obtained by inputting the tomographic image I1 to the second model 32M.
  • FIG. 18C shows the angular range relative to the vessel center.
  • a calcified plaque is found between the intima of the blood vessel and the EEM. In this case, therefore, a lesion is seen between the lumen boundary and the vessel boundary.
  • In this case, within the range in which the existence probability of the lesion calculated in step S301 is equal to or greater than the predetermined probability value, the processing unit 30 extracts, between the lumen boundary and the vessel boundary, the portion corresponding to the plaque based on the pixel values in the tomographic image I1 (S302).
  • the processing unit 30 calculates the angular range from the blood vessel center that captures the extracted range. As a result, it is possible to obtain an angle that captures a lesion that occurs between the intimal surface layer and the blood vessel boundary (external elastic lamina).
  • In the case of calcified plaque, correcting the vessel boundary is also effective. In FIG. 18C, the corrected boundary obtained by correcting the vessel boundary is shown by a thick circular arc.
  • Calcified plaque does not transmit ultrasonic waves, and there is a possibility that ultrasonic waves from the imaging device 11 will not reach the blood vessel boundary deeper than the plaque.
  • The area beyond the calcified plaque therefore becomes a dark region with considerably low pixel values, and the accuracy of segmentation by the first model 31M decreases there. Therefore, as shown in FIG. 18C, it is effective to connect the two intersections of the straight lines, extending from the image center of the tomographic image I1 to both ends of the angular range, with the vessel boundary curve B2 by an arc of a circle centered on the vessel center, or by a spline curve. In this way, the angular range data capturing the lesion or medical instrument obtained from the second model 32M can also be used for boundary correction.
  • FIG. 19 shows an example of calculating the angle at which an attenuating plaque is captured.
  • FIG. 19A shows a blood vessel boundary curve B2 obtained by inputting a tomographic image I1 showing an attenuating plaque to the first model 31M.
  • FIG. 19B shows an angular range in which a lesion or a medical instrument exists with reference to the center of the image obtained by inputting the tomographic image I1 to the second model 32M.
  • FIG. 19C shows the angular range relative to the vessel center.
  • Attenuating plaques appear dark because they attenuate ultrasonic waves. In this case, since the pixel values appear low, the accuracy of region identification and boundary calculation is low.
  • In this case, within the range in which the existence probability of the lesion calculated in step S301 is equal to or greater than the predetermined probability value and which is inside the vessel boundary, the processing unit 30 extracts the portion with low pixel values on the tomographic image I1 corresponding to the plaque (S302). As shown in FIGS. 19B to 19C, the processing unit 30 calculates the angular range from the vessel center that captures the extracted range. It is also possible to correct the boundary (S306).
  • FIG. 20 shows an example of calculating an angle at which a stent applied to a blood vessel is captured.
  • FIG. 20A shows a lumen boundary curve B1 and a blood vessel boundary curve B2 obtained by inputting a tomographic image I1 showing a stent into the first model 31M.
  • FIG. 20B shows an angular range in which a lesion or a medical instrument exists with reference to the center of the image obtained by inputting the tomographic image I1 to the second model 32M.
  • FIG. 20C shows the angular range relative to the vessel center.
  • the stent may exist inside the lumen, or it may be incorporated into plaque that has grown from the lumen and exist between the lumen boundary and the blood vessel boundary.
  • The stent reflects ultrasonic waves, so it appears bright. Since the stent is mesh-like, continuity of the range in which the existence probability is equal to or higher than the predetermined probability value is not required.
  • In this case, within the range in which the existence probability calculated in step S301 is equal to or greater than the predetermined probability value and which is inside the vessel boundary, the processing unit 30 extracts the portions with high pixel values on the tomographic image I1 (S302).
  • The processing unit 30 may remove, as noise, extracted portions whose size is inconsistent with the cross-sectional size of the stent. As shown in FIGS. 20B to 20C, the processing unit 30 calculates the angular range from the center of the blood vessel that captures the extracted range for each mesh portion of the stent.
  • FIG. 21 shows an example of calculating the angle at which the guidewire W inserted together with the catheter 1 is captured.
  • FIG. 21A shows a lumen boundary curve B1 obtained by inputting a tomographic image I1 showing the guidewire W into the first model 31M.
  • FIG. 21B shows an angular range in which a lesion or a medical instrument exists with reference to the image center obtained by inputting the tomographic image I1 to the second model 32M.
  • FIG. 21C shows the angular range relative to the vessel center.
  • the guide wire W exists inside the lumen.
  • In this case, within the range in which the existence probability calculated in step S301 is equal to or greater than the predetermined probability value and which is inside the lumen boundary curve B1, the processing unit 30 extracts the portion with low pixel values on the tomographic image I1 (S302).
  • As shown in FIGS. 21B to 21C, the processing unit 30 can calculate, from the center of the blood vessel, the angular range that captures the guide wire W based on the extracted range.
  • As described above, the second model 32M, which outputs the angular distribution of the probability that a lesion or a medical instrument exists, and the first model 31M, which identifies the lumen boundary and the vessel boundary, can be used together to obtain the angular range of the lesion referenced to the center of the blood vessel.
  • A single second model 32M can be used to obtain angular ranges for a variety of lesions and medical instruments, but to improve accuracy it is preferable to train a second model 32M for each type of lesion and medical instrument.
  • In the present embodiment, the description assumes that the image processing device 3 connected to the catheter 1 generates the tomographic image I1 in substantially real time based on the signals from the imaging device 11, calculates the angular range in which the lesion or the medical instrument exists, and displays it on the display device 4.
  • the processing by the image processing device 3 described above may be separately performed afterward based on signal data obtained from the imaging device 11 .
  • The image processing device 3 does not necessarily have to be directly connected to the imaging device 11 of the catheter 1, as long as it can acquire the signals from the imaging device 11.
  • the image processing device 3 may be a device such as a server device capable of reading out a storage device storing signals from the imaging device 11 via a network.
  • The processing of steps S101-S104 shown in the flowcharts of FIGS. 10 and 11 may be performed by an existing processing device, with the processing of steps S105-S116 performed by the image processing device 3 connected to that processing device, and the results displayed on the display device 4 via the processing device.
  • In the present embodiment, the medical image has been explained taking an IVUS image of a coronary artery as an example; however, the application is not limited to this and may use OCT/OFDI or the like, and the hollow organ is not limited to blood vessels.
  • Reference signs: 1 catheter; 11 imaging device; 3 image processing device (information processing device); 30 processing unit; 31 storage unit; 3P computer program; 31M first model; 32M second model; 4 display device; 400 screen; I0 rectangular image (image based on scanning signals); I1 tomographic image (image based on scanning signals)

Abstract

The present invention provides a model generation method for a model that outputs data on an angle capturing an anatomical feature of a hollow organ, a trained learning model, a computer program using the learning model, an information processing method, and an information processing device. The model generation method involves causing a computer to acquire a scanning signal from an imaging device provided in a catheter that is inserted into a hollow organ and moves along the length of the hollow organ while rotating about an axis defining the length direction, and to generate a learning model for outputting, when an image of the hollow organ based on the scanning signal is input, an existence probability of a lesion or a medical instrument for each rotation angle of the catheter. The trained model is used together with a model that causes the computer to output data in which the image of the hollow organ based on the scanning signal is identified as different regions including a lumen and a membrane of the hollow organ, and is used to perform a process of calculating an angle range for capturing a lesion or a medical instrument from the center of the hollow organ.
PCT/JP2022/036157 2021-09-30 2022-09-28 Model generation method, learning model, computer program, information processing method, and information processing device WO2023054467A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021161694 2021-09-30
JP2021-161694 2021-09-30

Publications (1)

Publication Number Publication Date
WO2023054467A1 (fr)

Family

ID=85782839

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036157 WO2023054467A1 (fr) 2021-09-30 2022-09-28 Model generation method, learning model, computer program, information processing method, and information processing device

Country Status (1)

Country Link
WO (1) WO2023054467A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010075616A (ja) * 2008-09-29 2010-04-08 Yamaguchi Univ Tissue property discrimination using sparse coding
JP2013543786A (ja) * 2010-11-24 2013-12-09 Boston Scientific Scimed, Inc. System and method for detecting and displaying body lumen bifurcations
JP2017503548A (ja) * 2013-12-20 2017-02-02 Koninklijke Philips N.V. Automatic ultrasound beam steering and needle artifact suppression
JP2017537768A (ja) * 2014-12-12 2017-12-21 LightLab Imaging, Inc. Systems and methods for detecting and displaying intravascular features
US20200226422A1 * 2019-01-13 2020-07-16 Lightlab Imaging, Inc. Systems and methods for classification of arterial image regions and features thereof
WO2021039101A1 (fr) * 2019-08-27 2021-03-04 FUJIFILM Corporation Ultrasound endoscope system and method of operating the ultrasound endoscope system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7365093B1 (ja) 2023-05-08 2023-10-19 康博 中島 Medical image processing apparatus, medical image processing program, and medical image processing method

Similar Documents

Publication Publication Date Title
JP7023715B2 (ja) Method of operating a system for determining intravascular stent strut coverage, and programmable processor-based computer apparatus of an intravascular imaging system for detecting stented regions
JP5944917B2 (ja) Computer-readable medium for detecting and displaying body lumen bifurcations, and system including the same
CN113544737A (zh) Systems and methods for classification of arterial image regions and features thereof
JP6913090B2 (ja) Intravascular imaging and guide catheter detection methods and systems
US20230020596A1 Computer program, information processing method, information processing device, and method for generating model
WO2023054467A1 (fr) Model generation method, learning model, computer program, information processing method, and information processing device
WO2022071264A1 (fr) Program, model generation method, information processing device, and information processing method
JP2022055170A (ja) Computer program, image processing method, and image processing device
US20230017227A1 Program, information processing method, information processing apparatus, and model generation method
WO2022071265A1 (fr) Program, information processing device, and information processing method
JP2007289720A (ja) Ultrasonic diagnostic imaging apparatus
WO2024071121A1 (fr) Computer program, information processing method, and information processing device
WO2021199961A1 (fr) Computer program, information processing method, and information processing device
WO2022202323A1 (fr) Program, information processing method, and information processing device
WO2024071251A1 (fr) Computer program, information processing method, information processing device, and learning model
US20220028079A1 Diagnosis support device, diagnosis support system, and diagnosis support method
JP2023051175A (ja) Computer program, information processing method, and information processing device
WO2022209657A1 (fr) Computer program, information processing method, and information processing device
JP2023049951A (ja) Computer program, information processing method, and information processing device
WO2023054442A1 (fr) Computer program, information processing device, and information processing method
WO2022202320A1 (fr) Program, information processing method, and information processing device
JP2024050056A (ja) Computer program, learning model, information processing method, and information processing device
WO2024071322A1 (fr) Information processing method, learning model generation method, computer program, and information processing device
WO2022209652A1 (fr) Computer program, information processing method, and information processing device
US20240008849A1 Medical system, method for processing medical image, and medical image processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22876333

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023551597

Country of ref document: JP