WO2023054366A1 - Medical Image Processing System - Google Patents
Medical Image Processing System
- Publication number
- WO2023054366A1 (PCT/JP2022/035957)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- disease
- information
- dimensional structure
- illumination light
- image
- Prior art date: 2021-10-01
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
Definitions
- the present invention relates to a medical image processing system that acquires disease-related information from medical images.
- the purpose of the present invention is to provide a medical image processing system that makes it possible to grasp disease-related information in its geometric or spatial context.
- a medical image processing system of the present invention comprises a processor. The processor acquires a plurality of medical images obtained by imaging a hollow organ from inside the lumen, acquires three-dimensional structure information indicating the three-dimensional structure of the hollow organ, acquires, from the plurality of medical images, disease region information indicating a disease region in the three-dimensional structure of the hollow organ and disease evaluation information indicating an evaluation of the disease in the disease region, and outputs the three-dimensional structure information, the disease region information, and the disease evaluation information in association with one another.
- the processor preferably acquires a first illumination light image and a second illumination light image as the plurality of medical images, acquires the three-dimensional structure information from the first illumination light image, acquires the disease region information from the first illumination light image or the second illumination light image, and acquires the disease evaluation information from the second illumination light image.
- the first illumination light image and the second illumination light image are obtained by continuous imaging, and the wavelength band of the first illumination light used to generate the first illumination light image is preferably wider than the wavelength band of the second illumination light used to generate the second illumination light image.
- the processor preferably obtains the disease region information using the consecutively captured first illumination light image and second illumination light image.
- the processor preferably generates a base image and a texture image from one frame of a medical image, acquires the three-dimensional structure information from the base image, acquires the disease region information from the base image or the texture image, and acquires the disease evaluation information from the texture image.
- the processor preferably acquires three-dimensional structural information using at least one of a plurality of medical images and an extraluminal image obtained by imaging the hollow organ from outside the lumen.
- the processor obtains the three-dimensional structural information by interpolating extraluminal images for regions of the hollow organ where no medical images exist.
- the processor preferably displays on the display a three-dimensional structure in which the positions occupied by the disease regions are indicated by disease display markers. It is preferable that, within the three-dimensional structure displayed on the display, the processor displays regions not captured by the plurality of medical images in a manner that distinguishes them from regions for which medical images exist. In addition to the three-dimensional structure, the display preferably shows a moving image based on the medical images.
- the three-dimensional structure information is preferably a predetermined schematic image.
- the processor preferably stores, in a disease-related information memory, a three-dimensional structure information set in which the three-dimensional structure information, the disease region information, and the disease evaluation information are associated with one another, and the three-dimensional structure information set from a first examination is stored in the disease-related information memory in association with the three-dimensional structure information set from a second examination performed after the first examination.
- the processor preferably displays the three-dimensional structure information set for the first examination and the three-dimensional structure information set for the second examination on the same screen of the display.
- the processor preferably acquires, from the three-dimensional structure information set obtained at the first examination and the three-dimensional structure information set obtained at the second examination, difference information representing the difference between the disease evaluation in the disease region at the first examination and the disease evaluation in the disease region at the second examination, and displays the difference information on the three-dimensional structure information.
- the disease is an inflammatory or neoplastic disease.
- according to the present invention, disease-related information can be grasped in its geometric or spatial context.
- FIG. 1 is a schematic diagram of an endoscope system;
- FIG. 2 is a block diagram showing functions of the endoscope system;
- FIG. 3 is an explanatory diagram of a first light emission pattern;
- FIG. 4 is an explanatory diagram of a second light emission pattern;
- FIG. 5 is an explanatory diagram of a third light emission pattern;
- FIG. 6 is a block diagram showing functions of an image processing unit;
- FIG. 7 is an explanatory diagram showing acquisition of three-dimensional structure information from a first illumination light image;
- FIG. 8 is an explanatory diagram showing interpolation using an extraluminal image;
- FIG. 9 is an explanatory diagram showing acquisition of three-dimensional structure information and disease region information from first illumination light images obtained by continuous imaging;
- FIG. 10 is an explanatory diagram showing a disease display marker representing a disease region in a three-dimensional structure;
- FIG. 11 is an explanatory diagram showing ellipses representing disease regions in a three-dimensional structure;
- FIG. 12 is an explanatory diagram showing three-dimensional structure information in which regions where medical images exist and regions where they do not are displayed separately;
- FIG. 13 is an image diagram of a display showing three-dimensional structure information during diagnosis;
- FIG. 14 is an image diagram of a display showing three-dimensional structure information after diagnosis is completed;
- FIG. 15 is an image diagram of a display showing three-dimensional structure information for a first examination and a second examination;
- FIG. 16 is an image diagram of a display showing difference information;
- FIG. 17 is an image diagram showing an information display area for a specific region SPR;
- FIG. 18 is an image diagram showing an information display area for a specific region SPR;
- FIG. 19 is an explanatory diagram showing acquisition of three-dimensional structure information, disease region information, and disease evaluation information from one frame of a medical image.
- the endoscope system 10 includes an endoscope 12, a light source device 13, a processor device 14, a display 15 and a user interface 16.
- the endoscope 12 is optically or electrically connected to the light source device 13 and electrically connected to the processor device 14 .
- the endoscope system 10 corresponds to the medical image processing system of the present invention that processes images obtained by the endoscope 12 as medical images.
- the endoscope 12 has an insertion section 12a, an operation section 12b, a bending section 12c and a distal end section 12d.
- the insertion portion 12a is inserted into the body of the subject.
- the operation portion 12b is provided at the proximal end portion of the insertion portion 12a.
- the curved portion 12c and the distal end portion 12d are provided on the distal end side of the insertion portion 12a.
- the bending portion 12c is bent by operating the angle knob 12e of the operation portion 12b.
- the distal end portion 12d is directed in a desired direction by the bending motion of the bending portion 12c.
- a forceps channel (not shown) for inserting a treatment tool or the like is provided from the insertion portion 12a to the distal end portion 12d.
- the treatment instrument is inserted into the forceps channel from the forceps port 12j.
- An optical system for forming a subject image and an optical system for illuminating the subject with illumination light are provided inside the endoscope 12 .
- the operation unit 12b is provided with an angle knob 12e, a mode changeover switch 12f, a still image acquisition instruction switch 12h, and a zoom operation unit 12i.
- the mode changeover switch 12f is used for an observation mode changeover operation.
- a still image acquisition instruction switch 12h is used to instruct acquisition of a still image of a subject.
- the zoom operation unit 12i is used for operations for enlarging or reducing an observation target.
- the light source device 13 generates illumination light.
- the processor device 14 performs system control of the endoscope system 10 and further performs image processing on image signals transmitted from the endoscope 12 to generate medical images.
- the display 15 displays medical images transmitted from the processor device 14 .
- the user interface 16 has a keyboard, mouse, microphone, tablet, touch pen, and the like, and receives input operations such as function settings.
- the light source device 13 includes a light source section 20 and an optical path coupling section 22 .
- the light source unit 20 has a plurality of semiconductor light sources, each of which is turned on or off. When a plurality of semiconductor light sources are lit, the illumination light that illuminates the subject is produced by controlling the amount of light emitted from each semiconductor light source.
- the light source unit 20 includes four color LEDs: a V-LED (Violet Light Emitting Diode) 20a, a B-LED (Blue Light Emitting Diode) 20b, a G-LED (Green Light Emitting Diode) 20c, and an R-LED (Red Light Emitting Diode) 20d.
- the light source unit 20 may be built in the endoscope 12 .
- the V-LED 20a generates violet light V with a central wavelength of 405 ⁇ 10 nm and a wavelength range of 380-420 nm.
- the B-LED 20b generates blue light B with a central wavelength of 450 ⁇ 10 nm and a wavelength range of 420-500 nm.
- the G-LED 20c generates green light G with a wavelength range of 480-600 nm.
- the R-LED 20d emits red light R with a central wavelength of 620-630 nm and a wavelength range of 600-650 nm.
- the light emitted by each of the LEDs 20a to 20d is incident on the light guide 23 via the optical path coupling section 22 composed of mirrors, lenses, and the like.
- the light guide 23 propagates the light from the optical path coupling portion 22 to the distal end portion 12 d of the endoscope 12 .
- An illumination optical system 30 and an imaging optical system 32 are provided at the distal end portion 12 d of the endoscope 12 .
- the illumination optical system 30 has an illumination lens 31 , and the illumination light propagated by the light guide 23 is applied to the object via the illumination lens 31 .
- when the light source unit 20 is built into the distal end portion 12d of the endoscope 12, the light is emitted toward the subject through the illumination lens of the illumination optical system without passing through the light guide.
- the imaging optical system 32 has an objective lens 35 , a zoom lens 36 and an imaging sensor 37 .
- Light from a subject irradiated with illumination light enters an imaging sensor 37 via an objective lens 35 and a zoom lens 36 .
- an image of the subject is formed on the imaging sensor 37 .
- the zoom lens 36 is a lens for enlarging a subject, and is moved between the tele end and the wide end by operating the zoom operation section 12i.
- the imaging sensor 37 is a primary color sensor and includes three types of pixels: B pixels (blue pixels) having blue color filters, G pixels (green pixels) having green color filters, and R pixels (red pixels) having red color filters.
- the imaging sensor 37 is preferably a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). In this embodiment, a CMOS image sensor is used as the image sensor 37, but a CCD image sensor may be used.
- the imaging processor 38 controls the imaging sensor 37 . Specifically, an image signal is output from the imaging sensor 37 by reading the signal of the imaging sensor 37 by the imaging processor 38 . The output image signal is sent to the processor device 14 .
- the processor device 14 has a medical image acquisition unit 40 , an image processing unit 41 , a display control unit 42 and a central control unit 43 .
- a program stored in a program memory (not shown) is run by the central control unit 43, which is composed of a processor, whereby the functions of the medical image acquisition unit 40, the image processing unit 41, and the display control unit 42 are realized. In addition, the functions of the three-dimensional structure information acquisition unit 50, the disease region information acquisition unit 51, the disease evaluation information acquisition unit 52, the disease-related information output unit 53, the difference information acquisition unit 57, and the specific region information output unit 59 included in the image processing unit 41 (see FIG. 6) are realized.
- the medical image acquisition unit 40 acquires image signals from the endoscope 12 as image signals of medical images.
- the image processing unit 41 applies various kinds of signal processing, such as defect correction, offset processing, demosaicing, matrix processing, white balance adjustment, gamma conversion, and YC conversion, to the image signals acquired by the medical image acquisition unit 40. It then generates a color medical image by performing image processing that includes color conversion processing such as 3×3 matrix processing, gradation conversion processing, and three-dimensional LUT (Look Up Table) processing, as well as color enhancement processing and structure enhancement processing such as spatial frequency enhancement.
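For orientation only, the order of these steps can be sketched as follows; this is not part of the publication, the offset value, normalization, gamma, and identity color matrix are placeholder assumptions, and the input is assumed to be an already demosaiced H×W×3 array:

```python
import numpy as np

def generate_color_medical_image(raw: np.ndarray) -> np.ndarray:
    """Toy stand-in for the signal-processing chain described above.
    Assumes `raw` is an already demosaiced HxWx3 array."""
    img = raw.astype(np.float32)
    img = np.clip(img - 64.0, 0.0, None)      # offset (black level) subtraction
    img = img / max(float(img.max()), 1e-6)   # crude white balance / normalization
    img = img ** (1.0 / 2.2)                  # gamma conversion
    ccm = np.eye(3, dtype=np.float32)         # 3x3 color matrix (identity placeholder)
    img = img.reshape(-1, 3) @ ccm.T
    return (img.reshape(raw.shape) * 255.0).astype(np.uint8)

frame = np.random.default_rng(0).integers(0, 4096, (480, 640, 3))
print(generate_color_medical_image(frame).shape)  # (480, 640, 3)
```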
- the image processing unit 41 acquires three-dimensional structure information, disease region information, and disease evaluation information. Acquisition of the three-dimensional structure information and the like will be described later.
- the display control unit 42 displays various information including the medical image generated by the image processing unit 41 on the display 15 .
- the endoscope system 10 has a mono light emission mode and a multi light emission mode as light emission modes for emitting illumination light.
- the light emission mode is switched via the central control unit 43 by operating the mode switch 12f.
- the mono light emission mode is a mode in which the illumination light of the same spectrum is continuously emitted to illuminate the object to be observed.
- the multi-light emission mode is a mode in which a plurality of illumination lights with different spectra are emitted while being switched according to a specific pattern to illuminate the subject.
- the illumination light includes a first illumination light L1 and a second illumination light L2 having a spectrum different from that of the first illumination light.
- the first illumination light L1 is preferably used for screening observation by giving brightness to the entire subject.
- the second illumination light L2 is preferably used to emphasize specific structures of the subject, such as mucosal gland ducts and blood vessels.
- in the mono light emission mode, either the first illumination light or the second illumination light is emitted.
- in the multi light emission mode, the first illumination light and the second illumination light are emitted while being switched according to a specific pattern.
- the first illumination light L1 is preferably broadband light such as white light.
- the second illumination light L2 preferably includes a second illumination light L2SP that emphasizes superficial blood vessels, a second illumination light L2SQ that emphasizes extremely superficial blood vessels shallower than the superficial blood vessels, a second illumination light L2SR for generating an oxygen saturation image using the difference in absorption coefficient between oxygenated hemoglobin and reduced hemoglobin, and a second illumination light L2SS for generating a color difference expanded image in which the color differences among a plurality of subject ranges are expanded.
- the light amounts of the four colors violet light V, blue light B, green light G, and red light R are controlled independently, and by changing these light amounts, the first illumination light L1 or one of the second illumination lights L2 (the second illumination light L2SP, L2SQ, L2SR, or L2SS) is emitted.
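To make the per-LED light amount control concrete, here is a minimal sketch, not taken from the publication; the class, the emission ratios, and the `emit` function are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class LightAmounts:
    """Relative emission amounts for the four LEDs (hypothetical units)."""
    violet: float  # V-LED 20a, 380-420 nm
    blue: float    # B-LED 20b, 420-500 nm
    green: float   # G-LED 20c, 480-600 nm
    red: float     # R-LED 20d, 600-650 nm

# Illustrative ratios only: L1 approximates broadband white light, while
# L2SP weights violet strongly to emphasize superficial blood vessels.
FIRST_ILLUMINATION_L1 = LightAmounts(violet=1.0, blue=1.0, green=1.0, red=1.0)
SECOND_ILLUMINATION_L2SP = LightAmounts(violet=4.0, blue=1.0, green=0.2, red=0.1)

def emit(light: LightAmounts) -> None:
    """Stand-in for the light source processor driving each LED's output."""
    for led, amount in vars(light).items():
        print(f"{led}: {amount}")

emit(SECOND_ILLUMINATION_L2SP)
```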
- Light emission control in the mono light emission mode and the multi light emission mode in the light source device 13 is performed by a light source processor (not shown).
- in the mono light emission mode, illumination light having the same spectrum is emitted continuously frame by frame.
- the first illumination light image is displayed on the display 15 by illuminating the subject with the first illumination light for each frame and capturing the image.
- the second illumination light image is displayed on the display 15 by illuminating the subject with the second illumination light for each frame and capturing the image.
- a frame is a unit of period including at least a period from the timing of light emission to completion of readout of the image signal by the imaging sensor 37 .
- in the multi light emission mode, the LEDs 20a to 20d are controlled so that the first illumination light and the second illumination light are automatically switched according to a specific light emission pattern. Specifically, the light amounts of the violet light V, blue light B, green light G, and red light R are changed for each specific frame F according to the specific light emission pattern.
- Examples of light emission patterns are given below.
- in the first light emission pattern, as shown in FIG. 3, a pattern is repeated in which the first illumination light L1 is emitted for two frames during the first light emission period Pe1, in which the subject is illuminated with the first illumination light L1, and the second illumination light L2 is emitted for one frame during the second light emission period Pe2, in which the subject is illuminated with the second illumination light L2.
- the same second illumination light L2SP is emitted in each second light emission period Pe2.
- the arrow indicates the direction in which time advances.
- in the first light emission pattern, two frames of the first illumination light image P1 are obtained during the first light emission period Pe1, and one frame of the second illumination light image P2SP is obtained during the second light emission period Pe2.
- the first illumination light L1 having a different spectrum may be emitted in each first light emission period Pe1 (the same applies to the second light emission pattern and the third light emission pattern described below).
- in the second light emission pattern, the first illumination light L1 is emitted for two frames during the first light emission period Pe1, and the second illumination light L2 is emitted for one frame during the second light emission period Pe2.
- the second illumination light L2 having a different spectrum is emitted in each second light emission period Pe2.
- the second illumination light L2SP and the second illumination light L2SQ are alternately emitted.
- in the second light emission pattern, two frames of the first illumination light image P1 are obtained during the first light emission period Pe1, and the second illumination light image P2SP and the second illumination light image P2SQ are obtained in alternating second light emission periods Pe2.
- in the third light emission pattern, a pattern is repeated in which the first illumination light L1 is emitted for one frame during the first light emission period Pe1 and the second illumination light L2 is emitted for four frames during the second light emission period Pe2.
- the second illumination light L2SP, the second illumination light L2SQ, the second illumination light L2SR, and the second illumination light L2SS having different spectra are emitted for each frame.
- in the third light emission pattern, the first illumination light image P1 is obtained during the first light emission period Pe1, and the second illumination light images P2SP, P2SQ, P2SR, and P2SS are obtained during the second light emission period Pe2.
- here, four types of second illumination light L2 having different spectra are emitted in rotation, but a number of types of second illumination light other than four may also be used.
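The three light emission patterns amount to repeating per-frame sequences, which a short sketch can summarize; the function and string labels below are shorthand assumptions, not the publication's notation:

```python
from itertools import cycle, islice

def emission_sequence(pattern: str, n_frames: int) -> list:
    """Per-frame illumination labels for the three patterns described above.
    The labels are shorthand for the lights L1, L2SP, L2SQ, L2SR, L2SS."""
    units = {
        "first": ["L1", "L1", "L2SP"],                       # Pe1 x2, Pe2 x1
        "second": ["L1", "L1", "L2SP", "L1", "L1", "L2SQ"],  # alternating L2
        "third": ["L1", "L2SP", "L2SQ", "L2SR", "L2SS"],     # Pe1 x1, Pe2 x4
    }
    return list(islice(cycle(units[pattern]), n_frames))

print(emission_sequence("first", 7))
# ['L1', 'L1', 'L2SP', 'L1', 'L1', 'L2SP', 'L1']
```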
- the image processing unit 41 includes a three-dimensional structure information acquisition unit 50, a disease region information acquisition unit 51, a disease evaluation information acquisition unit 52, a disease-related information output unit 53, a disease-related information memory 55, a difference information acquisition unit 57, and a specific region information output unit 59.
- a three-dimensional structure information acquisition unit 50 acquires three-dimensional structure information indicating the three-dimensional structure of a hollow organ.
- hollow organs include the esophagus, stomach, and large intestine. In this embodiment, the large intestine will be described as an example of a hollow organ.
- the three-dimensional structure information acquisition unit 50 acquires three-dimensional structure information from a first illumination light image P1, which is one of a plurality of medical images obtained by imaging a hollow organ from inside the lumen. Every time the first illumination light image is acquired, three-dimensional structure information is generated from the two-dimensional first illumination light image by a three-dimensional shape restoration technique (SfM (Structure from Motion)).
- a depth map image PD is acquired from the first illumination light image P1 by depth estimation processing. Then, the three-dimensional structure information TSI is acquired from the depth map image.
- the depth map image PD displays a depth distribution indicating the distance between each position of the observation target and the distal end portion 12d of the endoscope.
- the depth area PD1 is the area on the far side with the longest distance, and the depth area PD4 is the area on the near side with the shortest distance.
- the three-dimensional structure information acquisition unit 50 may also use a learning model generated by machine learning or deep learning based on first illumination light images and the three-dimensional structure information obtained from them; in this case, the first illumination light image is input to the model and the three-dimensional structure information is output.
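To picture the step from a depth map image PD to three-dimensional structure information, the sketch below back-projects per-pixel depths into camera-space points under an assumed pinhole camera model; none of these values come from the publication, and a full SfM pipeline would additionally estimate camera poses and fuse the per-frame point sets:

```python
import numpy as np

def unproject_depth_map(depth: np.ndarray, fx: float, fy: float,
                        cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map PD into camera-space 3D points using
    assumed pinhole intrinsics; returns an (N, 3) point array."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy 4x4 depth map: far at the top (like PD1), near at the bottom (like PD4).
depth = np.linspace(4.0, 1.0, 16).reshape(4, 4)
points = unproject_depth_map(depth, fx=2.0, fy=2.0, cx=2.0, cy=2.0)
print(points.shape)  # (16, 3)
```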
- methods by which the three-dimensional structure information acquisition unit 50 acquires the three-dimensional structure information include, in addition to acquiring it only from a plurality of medical images that are intraluminal images, acquiring it only from extraluminal images obtained by imaging the hollow organ from outside the lumen, and acquiring it from both the plurality of medical images and the extraluminal images.
- extraluminal images include CT (Computed Tomography) images, MRI (Magnetic Resonance Imaging) images, ultrasound images, and the like, and three-dimensional structure information can also be obtained from such extraluminal images.
- when the three-dimensional structure information acquisition unit 50 acquires three-dimensional structure information using both a plurality of medical images and extraluminal images, both intraluminal and extraluminal images exist for each position in the three-dimensional structure, which makes it possible to display the three-dimensional structure more accurately. Further, for a part of the hollow organ for which no medical image exists, the three-dimensional structure information may be obtained by interpolating with the extraluminal images. For example, as shown in FIG. 8, when no medical image exists for the rectal portion, interpolating with the extraluminal image Pout makes it possible to acquire three-dimensional structure information for the entire large intestine including the rectal portion.
- medical images acquired during past examinations may also be used as images for the interpolation.
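A minimal sketch of this interpolation idea, assuming a simple per-segment representation of the large intestine; the segment names and dictionary layout are illustrative, not from the publication:

```python
def interpolate_structure(endoscopic: dict, extraluminal: dict) -> dict:
    """Per-segment merge: keep endoscopy-derived structure where it exists
    and fall back to extraluminal (e.g. CT-derived) structure elsewhere."""
    segments = ["rectum", "sigmoid_colon", "descending_colon",
                "transverse_colon", "ascending_colon"]
    return {seg: endoscopic.get(seg, extraluminal.get(seg)) for seg in segments}

endo = {"sigmoid_colon": "mesh_from_SfM", "descending_colon": "mesh_from_SfM"}
ct = {seg: "mesh_from_CT" for seg in
      ["rectum", "sigmoid_colon", "descending_colon",
       "transverse_colon", "ascending_colon"]}
merged = interpolate_structure(endo, ct)
print(merged["rectum"])  # 'mesh_from_CT' -- filled in like Pout for the rectal portion
```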
- the disease region information acquisition unit 51 acquires disease region information, which indicates a disease region in the three-dimensional structure of the hollow organ, from the first illumination light image P1 or the second illumination light image P2 obtained by continuous imaging as the plurality of medical images. The continuous imaging preferably captures both the first illumination light image and the second illumination light image within, for example, the number of frames that can be captured per second.
- the disease is preferably an inflammatory disease such as ulcerative colitis or a neoplastic disease such as a tumor.
- it is preferable to output the disease region information by inputting the first illumination light image P1 or the second illumination light image P2 to a learning model obtained by machine learning or deep learning.
- the disease area information may be input by the user by operating the user interface 16 .
- the disease region information is acquired using the first illumination light images P11 and P12 and the second illumination light image P21 obtained by continuous imaging.
- for example, when the first illumination light image P11 and the first illumination light image P12 are obtained in succession, the three-dimensional structure information is acquired from the first illumination light image P11, the anatomical position is identified from the first illumination light image P12, and if the identification result for the first illumination light image P12 is "rectum", then "rectum" is taken as the disease region information.
- alternatively, both the acquisition of the three-dimensional structure information and the anatomical identification may be performed from the single-frame first illumination light image P11.
- as a method of acquiring the disease region information, the first illumination light image P1 and the second illumination light image P2 are sometimes aligned so that the coordinates of the first illumination light image P1 and the coordinates of the second illumination light image P2 are registered to each other.
- in identifying the anatomical position, the wavelength band of the first illumination light used to generate the first illumination light image is preferably wider than the wavelength band of the second illumination light used to generate the second illumination light image. For example, the first illumination light is preferably white light and the second illumination light is preferably specific narrow-band light. Illuminating the observation target with broadband light is likely to increase the accuracy of identifying the anatomical position.
- the anatomical position in the three-dimensional structure may be identified from the second illumination light image P2 instead of the first illumination light image P1. Further, the disease region information acquisition unit 51 may simply use position information in a three-dimensional structure as disease region information without identifying an anatomical position.
- the disease evaluation information acquisition unit 52 acquires disease evaluation information indicating the evaluation of the disease in the disease area from the second illumination light image P2.
- the disease evaluation information is preferably a pixel-by-pixel evaluation value of the second illumination light image P2 or a statistic based on a pixel-by-pixel evaluation value of the second illumination light image P2.
- the disease evaluation information includes, for example, the Mayo score indicating the degree of endoscopic inflammation, the UCEIS score and its subscores (vascular pattern visibility, degree of bleeding, and ulceration), endoscopic remission, the Geboes score indicating the degree of pathological inflammation, and the presence or absence of pathological remission.
- the disease evaluation information acquisition unit 52 preferably associates the disease evaluation information with the disease region information obtained based on the first illumination light image captured continuously with the second illumination light image P2, and outputs the associated information to the disease-related information output unit 53 or stores it.
- the disease evaluation information acquisition unit 52 associates the disease area information "rectum” with the disease evaluation information "Mayo 2".
- in acquiring the disease evaluation information, it is preferable to use a learning model obtained by machine learning or deep learning based on second illumination light images and their disease evaluations, and to output the disease evaluation information by inputting the second illumination light image P2 to the learning model.
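As an illustration of turning per-pixel evaluation values into a region-level statistic, here is a minimal sketch; the score range, the median statistic, and the Mayo-like thresholds are assumptions, and in practice a learning model would supply the pixel scores:

```python
import numpy as np

def evaluate_disease(pixel_scores: np.ndarray, region: str) -> dict:
    """Aggregate per-pixel evaluation values from a second illumination light
    image into one region-level statistic, then bin it into a Mayo-like grade."""
    stat = float(np.median(pixel_scores))               # statistic over the region
    grade = int(np.digitize(stat, [0.25, 0.5, 0.75]))   # thresholds are assumptions
    return {"disease_region": region, "statistic": round(stat, 3),
            "disease_evaluation": f"Mayo {grade}"}

pixel_scores = np.random.default_rng(1).uniform(0.4, 0.8, size=10_000)
print(evaluate_disease(pixel_scores, "rectum"))
# e.g. {'disease_region': 'rectum', 'statistic': 0.6, 'disease_evaluation': 'Mayo 2'}
```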
- the disease-related information output unit 53 outputs three-dimensional structure information, disease area information, and disease evaluation information in association with each other.
- the output destination is the display control unit 42 when displaying on the display 15, and the disease-related information memory 55 when saving.
- the display control unit 42 displays the output three-dimensional structure information, disease area information, and disease evaluation information on the display 15 .
- the display control unit 42 preferably displays on the display 15 a three-dimensional structure in which the positions occupied by the disease regions are displayed by the disease display markers DM.
- the display mode of the disease display marker DM is changed according to the content of the disease evaluation information. For example, when the disease evaluation information is the Mayo score, it is preferable to use a "red" disease display marker DMr for Mayo 3, a "green" disease display marker DMg for Mayo 2, a "blue" disease display marker DMb for Mayo 1, and a "light blue" disease display marker DMw for Mayo 0 (all represented by hatching in FIG. 10). A "purple" disease display marker DMv is preferably used for a disease region for which disease evaluation information has not been calculated.
- disease area information may be displayed in addition to disease evaluation information.
- as shown in FIG. 11, the disease region information for each site is grouped and displayed with an ellipse.
- the ellipse EP1 indicates the rectum, the ellipse EP2 indicates the sigmoid colon, and the ellipse EP3 indicates the descending colon.
- disease evaluation information is obtained for each site, and a disease display marker DM corresponding to the disease evaluation information for each site is displayed.
- for example, a "red" disease display marker DMr is displayed on the rectum, a "purple" disease display marker DMv is displayed on the sigmoid colon, and a "blue" disease display marker DMb is displayed on the descending colon.
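The marker color assignment described above reduces to a small lookup, sketched below; the function name and the None convention for uncalculated evaluations are assumptions:

```python
def disease_marker_color(mayo):
    """Map a Mayo score to the disease display marker colors described above;
    None means no disease evaluation information has been calculated (DMv)."""
    if mayo is None:
        return "purple"                     # DMv
    return {3: "red",                       # DMr
            2: "green",                     # DMg
            1: "blue",                      # DMb
            0: "light blue"}[mayo]          # DMw

for site, mayo in [("rectum", 3), ("sigmoid colon", None), ("descending colon", 1)]:
    print(site, "->", disease_marker_color(mayo))
# rectum -> red, sigmoid colon -> purple, descending colon -> blue (as in FIG. 11)
```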
- in the three-dimensional structure displayed on the display 15, a region Rgx not captured by the plurality of medical images is preferably displayed separately from a region Rgy for which medical images exist. Specifically, as shown in FIG. 12, when the region not captured in the first illumination light image P1 or the second illumination light image P2 is the rectum, the rectum is drawn with dotted lines to indicate that it is the region Rgx, and the other parts captured in the first illumination light image P1 or the second illumination light image P2 are drawn with solid lines to indicate that they are regions Rgy. For the rectal portion, it is preferable to switch from the dotted-line display to the solid-line display once the extraluminal image has been interpolated.
- in addition to the three-dimensional structure information, the display 15 preferably shows a real-time moving image based on the medical images. As shown in FIGS. 13 and 14, it is preferable to display the three-dimensional structure information on the right half of the display 15 and the real-time moving image MV based on the medical images on the left half. For example, in FIG. 13, a diagnosis is in progress and the three-dimensional structure information is still being constructed, so only part of it is displayed on the display 15. In FIG. 14, the diagnosis has been completed, so all of the three-dimensional structure information is displayed.
- the three-dimensional structure information may also be acquired by a method that does not use the medical images captured by the endoscope 12.
- for example, a schematic image (schema image) representing the average three-dimensional structure of a hollow organ such as the large intestine may be determined in advance, and this predetermined schematic image may be acquired as the three-dimensional structure information.
- when the disease-related information output unit 53 saves the three-dimensional structure information and related data, the three-dimensional structure information set, in which the three-dimensional structure information, the disease region information, and the disease evaluation information are associated with one another, is stored in the disease-related information memory 55.
- for example, by storing the three-dimensional structure information and the disease evaluation information in association with each other, it can be grasped that the disease evaluation information in a region X of the three-dimensional structure information is "Mayo 1". Similarly, when the disease evaluation information "Mayo 2" is obtained, storing the disease region information and the disease evaluation information in association with each other makes it possible to grasp that the disease evaluation information for the "rectum" is "Mayo 2".
- it is preferable to treat the completed three-dimensional structure information set as the three-dimensional structure information set at the time of a first examination, and to store it in the disease-related information memory 55 in association with the three-dimensional structure information set at the time of a second examination performed after the first examination.
- the display control unit 42 can read the three-dimensional structure information sets for the first examination and the second examination from the disease-related information memory 55 and display the three-dimensional structure information set TSI1 for the first examination and the three-dimensional structure information set TSI2 for the second examination on the same screen of the display 15, as shown in FIG. 15.
- in this case, it is also preferable to display the examination date ("March 8, 2010" for TSI1 and "June 1, 2021" for TSI2) and the type of disease evaluation information currently displayed ("Mayo" for TSI1 and "UCEIS" for TSI2).
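One plausible shape for such associated examination records, sketched as a data structure; the field names and memory layout are assumptions, not the publication's format:

```python
from dataclasses import dataclass, field

@dataclass
class StructureInfoSet:
    """One examination's three-dimensional structure information set as it
    might be stored in the disease-related information memory."""
    exam_date: str
    score_type: str                 # e.g. "Mayo" or "UCEIS"
    evaluations: dict = field(default_factory=dict)  # disease region -> score

# First and second examinations stored in association with each other.
disease_related_memory = {
    "patient-001": [
        StructureInfoSet("2010-03-08", "Mayo", {"rectum": 3, "sigmoid_colon": 1}),
        StructureInfoSet("2021-06-01", "UCEIS", {"rectum": 2, "sigmoid_colon": 1}),
    ]
}
first_exam, second_exam = disease_related_memory["patient-001"]
print(first_exam.exam_date, "->", second_exam.exam_date)
```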
- the difference information acquisition unit 57 preferably reads the three-dimensional structure information sets for the first examination and the second examination from the disease-related information memory 55, acquires, from the disease region information and disease evaluation information of the two examinations, difference information representing the difference between the disease evaluation in the disease region at the first examination and that at the second examination, and displays the difference information on the three-dimensional structure information. Specifically, as shown in FIG. 16, the difference information DF is preferably the difference between the disease evaluation at the first examination and the disease evaluation at the second examination within the common region CM shared by the disease region at the first examination and the disease region at the second examination.
- for example, the difference in the numerical Mayo level (1 to 4) between the first examination and the second examination is used as the difference in disease evaluation.
- the difference in disease evaluation is represented by gradation, and it is preferable that the greater the difference value, the darker the gradation (in FIG. 16, the gradation is represented by the interval of the hatching lines).
- in FIG. 16, the common region CM1 in the rectum has a large difference value (the interval between hatching lines is narrow) compared with the common regions CM2 to CM6 in other parts, which shows that the disease state there has changed, for worse or for better, to a greater degree.
- "Mayo" in FIG. 16 indicates that the type of disease evaluation information displayed with the difference information DF is the Mayo score.
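A minimal sketch of computing difference information over the common regions, assuming integer scores per named region:

```python
def difference_information(first: dict, second: dict) -> dict:
    """Difference in disease evaluation over the common regions CM of two
    examinations; a larger value would be drawn with a darker gradation."""
    common_regions = first.keys() & second.keys()
    return {region: abs(second[region] - first[region]) for region in common_regions}

first_exam = {"rectum": 3, "sigmoid_colon": 1, "descending_colon": 2}
second_exam = {"rectum": 1, "sigmoid_colon": 1}
print(difference_information(first_exam, second_exam))
# {'rectum': 2, 'sigmoid_colon': 0} -- the rectum changed most, as with CM1 in FIG. 16
```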
- when the three-dimensional structure information is displayed on the display 15 and the user designates a specific region of the three-dimensional structure, the specific region information output unit 59 displays the disease region information and disease evaluation information corresponding to that specific region on the display 15. The user designates the specific region by operating the user interface 16. As shown in FIG. 17, the specific region SPR is designated by moving the pointer PT to the disease region whose disease region information and disease evaluation information are to be displayed and performing a confirmation operation. The outer frame of the specific region SPR being designated is displayed in bold. In response to the designation of the specific region SPR, the specific region information output unit 59 reads the corresponding disease region information and disease evaluation information from the disease-related information memory 55 and displays them in the information display area AR.
- in the embodiment described above, the three-dimensional structure information, the disease region information, and the disease evaluation information are obtained using at least two frames, the first illumination light image P1 and the second illumination light image P2, obtained at different timings.
- the three-dimensional structure information, the disease region information, and the disease evaluation information may be obtained from one frame of medical image obtained at the same timing.
- in this case, a base image from which low-frequency components are extracted and a texture image from which high-frequency components are extracted are generated from the medical image, for example a first illumination light image P1 such as a white light image. The base image is an image that serves as a reference for brightness, and the texture image is an image from which the mucous membrane structure, the blood vessel running pattern, and the like are extracted.
- the three-dimensional structural information is preferably obtained from the base image.
- Disease region information is preferably obtained from the base image or the texture image.
- Disease evaluation information is preferably obtained from the texture image.
- prior to acquiring the three-dimensional structure information, the base image is preferably subjected to brightness correction processing, and the texture image is preferably subjected to texture enhancement processing such as frequency component extraction processing or contrast enhancement processing.
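A Gaussian low-pass split is one plausible way to realize the base/texture decomposition; the publication does not specify the filter, so the method and sigma below are assumptions (uses scipy):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_base_texture(frame: np.ndarray, sigma: float = 15.0):
    """Split one frame into a low-frequency base image (brightness reference)
    and a high-frequency texture image (mucosal structure, vessel pattern)."""
    base = gaussian_filter(frame.astype(np.float32), sigma=sigma)  # low-pass
    texture = frame.astype(np.float32) - base                      # residual
    return base, texture

frame = np.random.default_rng(0).uniform(0.0, 255.0, (256, 256)).astype(np.float32)
base, texture = split_base_texture(frame)
print(round(float(base.mean()), 1), round(float(texture.std()), 1))
```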
- in the embodiment described above, the hardware structure of the processing units that execute various kinds of processing, such as the medical image acquisition unit 40, the display control unit 42, the three-dimensional structure information acquisition unit 50, the disease region information acquisition unit 51, the disease evaluation information acquisition unit 52, the disease-related information output unit 53, the difference information acquisition unit 57, and the specific region information output unit 59, corresponds to the following various processors.
- the various processors include a CPU (Central Processing Unit), a GPU (Graphical Processing Unit), a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array) whose circuit configuration can be changed after manufacture, and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively for executing various kinds of processing.
- one processing unit may be composed of one of these various processors, or of a combination of two or more processors of the same or different types (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU).
- a plurality of processing units may be configured by one processor.
- as a first example of configuring a plurality of processing units with one processor, there is a form in which one processor is configured by a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor functions as the plurality of processing units.
- as a second example, as typified by a System On Chip (SoC), there is a form of using a processor that realizes the functions of the entire system including the plurality of processing units with a single IC (Integrated Circuit) chip.
- the various processing units are configured using one or more of the above various processors as a hardware structure.
- the hardware structure of these various processors is, more specifically, an electric circuit in the form of a combination of circuit elements such as semiconductor elements.
- the hardware structure of the storage unit is a storage device such as an HDD (hard disc drive) or SSD (solid state drive).
- 10 endoscope system 12 endoscope 12a insertion portion 12b operation portion 12c bending portion 12d tip portion 12e angle knob 12f mode changeover switch 12h still image acquisition instruction switch 12i zoom operation portion 12j forceps port 13 light source device 14 processor device 15 display 16 user interface 20 light source unit 20a V-LED 20b B-LED 20c G-LED 20d R-LED 22 optical path coupling unit 23 light guide 30 illumination optical system 31 illumination lens 32 imaging optical system 35 objective lens 36 zoom lens 37 imaging sensor 38 imaging processor 40 medical image acquisition unit 41 image processing unit 42 display control unit 43 central control unit 50 three-dimensional structure information acquisition unit 51 disease region information acquisition unit 52 disease evaluation information acquisition unit 53 disease-related information output unit 55 disease-related information memory 57 difference information acquisition unit 59 specific region information output unit CM1 to CM6 common region DF difference information DM, DMv, DMb, DMg, DMr, DMw disease display markers EP1 to EP3 ellipse L1 first illumination light L2, L2SP, L2SQ,
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Optics & Photonics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Endoscopes (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023551528A | 2021-10-01 | 2022-09-27 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021162783 | 2021-10-01 | ||
JP2021-162783 | 2021-10-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023054366A1 (ja) | 2023-04-06 |
Family
ID=85782719
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/035957 WO2023054366A1 (ja) | 2021-10-01 | 2022-09-27 | 医療画像処理システム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023054366A1
WO (1) | WO2023054366A1
- 2022-09-27: JP application JP2023551528A (patent/JPWO2023054366A1/ja), active, Pending
- 2022-09-27: WO application PCT/JP2022/035957 (patent/WO2023054366A1/ja), active, Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012050606A (ja) * | 2010-08-31 | 2012-03-15 | Fujifilm Corp | Endoscopic image processing device and method, and program
JP2012152284A (ja) * | 2011-01-24 | 2012-08-16 | Hoya Corp | Image processing device, processor device for electronic endoscope, and image processing method
JP2012170774A (ja) * | 2011-02-24 | 2012-09-10 | Fujifilm Corp | Endoscope system
JP2015107268A (ja) * | 2013-12-05 | 2015-06-11 | 国立大学法人名古屋大学 | Endoscopic observation support device
JP2015136397A (ja) * | 2014-01-20 | 2015-07-30 | 富士フイルム株式会社 | Processor device for endoscope system, endoscope system, operation method of processor device for endoscope system, and operation method of endoscope system
JP2017536212A (ja) * | 2014-12-08 | 2017-12-07 | Koninklijke Philips N.V. | Device and method for recommending a diagnostic procedure based on co-registered angiographic images and physiological information measured by an intravascular device
WO2018043550A1 (ja) * | 2016-08-31 | 2018-03-08 | Hoya株式会社 | Processor for electronic endoscope and electronic endoscope system
Also Published As
Publication number | Publication date |
---|---|
JPWO2023054366A1 (ja) | 2023-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6785941B2 (ja) | Endoscope system and method for operating same | |
JP7125484B2 (ja) | Endoscope system | |
JP6831463B2 (ja) | Medical image processing device, endoscope system, diagnosis support device, and medical service support device | |
JP7289296B2 (ja) | Image processing device, endoscope system, and method for operating image processing device | |
JP7289373B2 (ja) | Medical image processing device, endoscope system, diagnosis support method, and program | |
JP7335399B2 (ja) | Medical image processing device, endoscope system, and method for operating medical image processing device | |
US20180344129A1 (en) | Endoscope processor and operation method of endoscope processor | |
EP4183311A1 (en) | Image analysis processing device, endoscopy system, operation method for image analysis processing device, and program for image analysis processing device | |
WO2022014077A1 (ja) | Endoscope system and method for operating same | |
US20230237659A1 (en) | Image processing apparatus, endoscope system, operation method of image processing apparatus, and non-transitory computer readable medium | |
JP2021045337A (ja) | Medical image processing device, processor device, endoscope system, medical image processing method, and program | |
JP7130043B2 (ja) | Medical image processing device, endoscope system, and method for operating medical image processing device | |
JP7702294B2 (ja) | Endoscope system and method for operating same | |
JP7386347B2 (ja) | Endoscope system and method for operating same | |
WO2023054366A1 (ja) | Medical image processing system | |
JP6731065B2 (ja) | Endoscope system and method for operating same | |
WO2023132188A1 (ja) | Endoscope system and method for operating same | |
WO2021006121A1 (ja) | Image processing device, endoscope system, and method for operating image processing device | |
JP2023007331A (ja) | Endoscope system, medical image processing device, and method for operating same | |
WO2021149140A1 (ja) | Endoscope device, control device, method for operating endoscope device, and program | |
WO2023058503A1 (ja) | Endoscope system, medical image processing device, and method for operating same | |
WO2022059233A1 (ja) | Image processing device, endoscope system, method for operating image processing device, and program for image processing device | |
JP7556961B2 (ja) | Endoscope system, method for operating endoscope system, and program for endoscope system | |
JP6999597B2 (ja) | Processor device, method for operating same, and endoscope system | |
US20240013392A1 (en) | Processor device, medical image processing device, medical image processing system, and endoscope system | |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22876232; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 2023551528; Country of ref document: JP
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22876232; Country of ref document: EP; Kind code of ref document: A1