WO2020170809A1 - Dispositif de traitement d'image médicale, système d'endoscope et procédé de traitement d'image médicale - Google Patents

Dispositif de traitement d'image médicale, système d'endoscope et procédé de traitement d'image médicale Download PDF

Info

Publication number
WO2020170809A1
WO2020170809A1 (PCT/JP2020/004224)
Authority
WO
WIPO (PCT)
Prior art keywords
medical image
image
real
image processing
information
Prior art date
Application number
PCT/JP2020/004224
Other languages
English (en)
Japanese (ja)
Inventor
Shumpei KAMON (加門 駿平)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2021501830A priority Critical patent/JPWO2020170809A1/ja
Publication of WO2020170809A1 publication Critical patent/WO2020170809A1/fr
Priority to JP2022200252A priority patent/JP2023026480A/ja

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 — Control thereof

Definitions

  • the present invention relates to a medical image processing apparatus, an endoscope system, and a medical image processing method for highlighting a medical image.
  • Patent Document 1 describes that an alert image is displayed according to the size and number of attention areas.
  • the preferred highlighting mode differs depending on the nature of the image, so it must be switched appropriately. For example, a user can be notified of lesion detection by superimposing a figure or the like on a region of interest detected in an image acquired in real time during a screening examination. On the other hand, the user saves images of lesions found during the examination and uses them for report creation and research. In addition, a saved image is displayed again for observation when necessary. When an image acquired in non-real time is displayed in this way, it is preferable to set the highlighted display of the region of interest to a mode different from that used in real-time display, because the superimposed display of figures and the like can hinder observation and diagnosis. However, such a situation is not taken into consideration in Patent Document 1 described above.
  • the present invention has been made in view of such circumstances, and an object thereof is to provide a medical image processing apparatus, an endoscope system, and a medical image processing method capable of highlighting a medical image according to real-time property.
  • the medical image processing apparatus according to a first aspect includes: a medical image acquisition unit that acquires a medical image; an information acquisition unit that acquires information on a region of interest in the medical image; a mode setting unit that, based on the information, sets the display mode for highlighting the region of interest according to whether the medical image is a real-time image acquired in real time or a non-real-time image acquired in non-real time; a display control unit that causes a display device to display the medical image in the set display mode; and a recording control unit that causes a recording device to record the medical image and the information in association with each other.
  • the medical image processing apparatus can highlight a medical image according to its real-time property (whether the image is a real-time image acquired in real time or a non-real-time image acquired in non-real time).
  • since the medical image and the information are recorded in the recording device in association with each other (the information is recorded without being superimposed on the image), the medical image can afterwards (in the case of a non-real-time image) be displayed in a display mode according to the purpose of observation and the wishes of the user.
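The patent describes this recording-and-mode-setting behavior only functionally. As a minimal, hypothetical sketch (all names and mode values are illustrative, not from the patent), the logic might look like:

```python
from dataclasses import dataclass

@dataclass
class MedicalImage:
    pixels: object          # image data (placeholder)
    is_real_time: bool      # True: acquired in an ongoing examination

def set_display_mode(image: MedicalImage, user_wants_highlight: bool = True) -> dict:
    """Choose a highlighting mode according to real-time property.

    Real-time images always get an overlay so lesions are not overlooked;
    for non-real-time (saved) images the overlay can hinder observation,
    so it follows the user's preference instead.
    """
    if image.is_real_time:
        return {"highlight": True, "style": "overlay_figure"}
    return {"highlight": user_wants_highlight, "style": "outline_only"}

record = {}  # recording-device stand-in: image and ROI info kept separately

def record_image(image_id: str, image: MedicalImage, roi_info: list) -> None:
    # The ROI information is associated with, not burned into, the image,
    # so the display mode can be chosen freely afterwards.
    record[image_id] = {"image": image, "roi": roi_info}
```

Because the region-of-interest information is stored alongside rather than inside the pixels, the same saved image can later be shown with or without highlighting.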
  • the “display mode setting” includes whether to set highlighting to on or off and how to display the attention area information.
  • Information can be displayed by letters, numbers, figures, symbols, and combinations thereof, and may be colored.
  • “real time” and “real-time image” include, for example, sequentially acquiring and displaying, without substantial delay, a plurality of medical images taken at a predetermined frame rate in an ongoing examination or observation, and the images so sequentially displayed. A case where acquisition or display of an image is slightly delayed relative to complete real time due to image transmission/reception, image processing, or the like is also included in “real time” or “real-time image”.
  • “non-real time” and “non-real-time image” include, for example, a case where a still image is acquired and displayed during or after the examination.
  • the number of display devices is not particularly limited. Both the real-time image and the non-real-time image may be displayed on the same display device. In this case, for example, a real-time image can be displayed and a specific frame can be freeze-displayed according to a user's instruction or the like. Alternatively, a display device for real-time images and a display device for non-real-time images may be separately provided, and images may be displayed on the respective display devices. When a separate monitor is provided, for example, a real-time image may be displayed on a certain display device, and the image acquired by the inspection may be displayed on another display device in a non-real time, for example, when creating a report. Both aspects are included in “displaying medical image on display device”.
  • the medical image processing apparatus can be realized as, for example, a processor of a medical image processing system, but is not limited to such an aspect.
  • the “medical image” refers to an image obtained as a result of photographing or measuring a living body such as a human body for the purpose of diagnosis, treatment, measurement, etc., for example, an endoscopic image, an ultrasonic image, a CT image (CT: Computed Tomography), or an MRI image (MRI: Magnetic Resonance Imaging). Medical images are also referred to as medical-use images.
  • in the medical image processing apparatus according to a second aspect, the mode setting unit turns highlighting of the region of interest on or off when the medical image is a real-time image.
  • the information of the attention area can be notified in real time by highlighting, and it is possible to prevent a lesion or the like from being overlooked.
  • when highlighting or the like interferes with diagnosis, it may be set to off. Whether to set it to on or off may be determined based on the user's operation, or may be determined regardless of the user's operation.
  • in the medical image processing apparatus according to a third aspect, the mode setting unit sets highlighting of the region of interest to ON or OFF when the medical image is a non-real-time image.
  • An image with information superimposed is not suitable for reporting or diagnosis, while information on the region of interest is necessary when looking back at the image after inspection (oversight prevention, research use, etc.).
  • highlighting is turned on or off in the case of a non-real-time image, so that highlighting can be performed according to the purpose of use of the image. It can be set to ON or OFF according to a user's operation, whereby the user can display a medical image in a desired display mode.
  • the medical image processing apparatus according to a fourth aspect is the medical image processing apparatus according to any one of the first to third aspects, wherein the medical image acquisition unit acquires a medical image recorded in the recording device as a non-real-time image, and the information acquisition unit acquires the information recorded in association with the non-real-time image.
  • the medical image acquisition unit may acquire a medical image from a recording device connected via a network.
  • the medical image processing apparatus according to a fifth aspect is the medical image processing apparatus according to any one of the first to fourth aspects, including a detector that detects a region of interest from the medical image, and the information acquisition unit acquires information based on a detection result of the detector.
  • the medical image to be detected may be a real-time image or a non-real-time image.
  • the detector can be configured by a trained model.
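The detector's internals are left open beyond noting that a trained model can be used. As an illustrative sketch of the post-processing step only, converting a hypothetical model's per-pixel probability map into region-of-interest coordinates (names and the threshold are assumptions, not patent content):

```python
def probability_map_to_roi(prob_map, threshold=0.5):
    """Convert a detector's per-pixel probabilities (list of rows) into
    a single bounding box covering all pixels at or above `threshold`.
    Returns None when nothing exceeds the threshold."""
    hits = [(r, c)
            for r, row in enumerate(prob_map)
            for c, p in enumerate(row)
            if p >= threshold]
    if not hits:
        return None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    # (top, left, bottom, right) in pixel coordinates
    return (min(rows), min(cols), max(rows), max(cols))
```

The resulting coordinates are the kind of region-of-interest information that can be recorded in association with the image and later drawn, or not drawn, over it.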
  • the medical image processing apparatus is the medical image processing apparatus according to any one of the first to fifth aspects, including a classifier that classifies the medical images, and the information acquisition unit acquires information from a classification result of the classifier.
  • the medical image to be classified may be a real-time image or a non-real-time image.
  • the classifier can be composed of a trained model.
  • the medical image processing apparatus is the medical image processing apparatus according to any one of the first to sixth aspects, including a measuring instrument that measures a medical image, and the information acquisition unit obtains information from a measurement result of the measuring instrument.
  • the medical image to be measured may be a real-time image or a non-real-time image.
  • the measuring instrument can be configured by a trained model.
  • the medical image processing apparatus according to an eighth aspect is the medical image processing apparatus according to any one of the first to seventh aspects, including a reception unit that receives a display-mode setting operation by a user, and the mode setting unit sets the display mode based on the received setting operation.
  • the user can display the medical image in a desired display mode (a highlighting mode for the region of interest).
  • the medical image processing apparatus according to a ninth aspect is the apparatus according to any one of the first to eighth aspects, further including a first time information acquisition unit that acquires information on a first elapsed time during which a medical image showing the same region of interest is displayed on the display device, and the mode setting unit sets the display mode based on the information on the first elapsed time.
  • a ninth aspect is, for example, in the case of sequentially acquiring and displaying real-time images at a predetermined frame rate, according to the elapsed time (first elapsed time) of the state in which the same region of interest appears in a plurality of frames. It can be applied when setting the display mode. In the ninth mode, the display mode can be changed (including ON/OFF switching) according to the first elapsed time.
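The ninth aspect's idea can be illustrated with a hedged sketch; the mode names and time thresholds below are illustrative placeholders, not values from the patent:

```python
def highlight_for_elapsed(first_elapsed_s: float,
                          full_emphasis_s: float = 3.0,
                          off_after_s: float = 10.0) -> str:
    """Pick a highlighting mode from how long the same region of
    interest has been on screen (thresholds are illustrative)."""
    if first_elapsed_s < full_emphasis_s:
        return "full"      # freshly found: strong overlay
    if first_elapsed_s < off_after_s:
        return "subtle"    # user has seen it: unobtrusive outline
    return "off"           # long enough: stop highlighting
```

The same shape of function applies to the second elapsed time of the tenth aspect (how long the same image has been freeze-displayed).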
  • a medical image processing apparatus according to a tenth aspect is the medical image processing apparatus according to any one of the first to ninth aspects, further including a second time information acquisition unit that acquires information on a second elapsed time during which the same medical image is displayed on the display device, and the mode setting unit sets the display mode based on the information on the second elapsed time.
  • the tenth aspect can be applied, for example, to setting the display mode according to the elapsed time (second elapsed time) since an image was displayed when a non-real-time image is observed by freeze display (fixed display).
  • the display mode can be changed (including ON/OFF switching) according to the second elapsed time.
  • a medical image processing apparatus according to an eleventh aspect is the medical image processing apparatus according to any one of the first to tenth aspects, wherein the display control unit causes the display device to display a list of a plurality of medical images recorded in the recording device, and the mode setting unit sets a common mode for highlighting the plurality of medical images displayed in the list.
  • the display mode could be set for each image individually, but in the eleventh aspect the highlighting mode is set to a common mode, so the operation is simple. This setting can be performed according to the user's operation.
  • in a twelfth aspect, the common mode is ON or OFF of highlighting for the plurality of medical images. According to the twelfth aspect, it is possible to collectively turn highlighting on or off for a plurality of medical images.
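The collective ON/OFF of the twelfth aspect reduces to applying one common setting across the listed images; a trivial sketch (names are hypothetical):

```python
def set_list_highlighting(image_ids, modes, on: bool) -> None:
    """Apply one common highlight setting to every image shown in the
    list, instead of toggling each image individually."""
    for image_id in image_ids:
        modes[image_id] = {"highlight": on}
```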
  • a medical image processing apparatus according to a thirteenth aspect is the medical image processing apparatus according to any one of the first to twelfth aspects, in which the mode setting unit determines that the medical image is a real-time image when medical images are sequentially acquired in an ongoing examination or ongoing observation.
  • the thirteenth aspect can be applied to ongoing examination or ongoing observation.
  • a medical image processing apparatus according to a fourteenth aspect is the medical image processing apparatus according to any one of the first to thirteenth aspects, further including a recording unit, and the mode setting unit determines that the medical image is a non-real-time image when the medical image is among the images already recorded in the recording unit.
  • the fourteenth aspect can be applied to medical images already recorded in the recording unit.
  • the medical image processing apparatus according to a fifteenth aspect is the apparatus according to any one of the first to fourteenth aspects, further including a communication control unit, and the mode setting unit determines that the medical image is a non-real-time image when the medical image is acquired via the communication control unit.
  • the fifteenth aspect can be applied to a medical image acquired via the communication control unit.
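The determination rules of the thirteenth to fifteenth aspects can be summarized in one hypothetical function (the source labels are illustrative, not patent terminology):

```python
def judge_real_time(source: str) -> bool:
    """Determine the real-time property from how the image was acquired,
    mirroring the three determination rules described above."""
    if source == "ongoing_examination":   # sequential acquisition -> real time
        return True
    if source in ("recording_unit", "communication_unit"):
        return False                      # stored or transferred -> non-real-time
    raise ValueError(f"unknown source: {source}")
```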
  • an endoscope system according to a sixteenth aspect includes the medical image processing apparatus according to any one of the first to fifteenth aspects, the display device, and an endoscope scope to be inserted into a subject, the endoscope scope having an imaging unit for capturing the medical image.
  • the endoscope system according to the sixteenth aspect includes the medical image processing apparatus according to any one of the first to fifteenth aspects, it is possible to perform emphasis processing according to the real-time property of the medical image.
  • the real-time image can be taken by the imaging unit of the endoscope.
  • the number of display devices in the sixteenth aspect is not particularly limited, as in the first aspect described above. Both the real-time image and the non-real-time image may be displayed on the same display device, or a display device for the real-time image and a display device for the non-real-time image may be separately provided and images may be displayed on the respective display devices.
  • a medical image processing method according to a seventeenth aspect includes: a medical image acquisition step of acquiring a medical image; an information acquisition step of acquiring information on a region of interest in the medical image; a mode setting step of, based on the information, setting the display mode for highlighting the region of interest according to whether the medical image is a real-time image acquired in real time or a non-real-time image acquired in non-real time; a display control step of displaying the medical image on a display device in the set display mode; and a recording control step of recording the medical image and the information in association with each other in a recording device.
  • according to the seventeenth aspect, similarly to the first aspect, it is possible to highlight the medical image according to its real-time property.
  • the medical image processing method according to the eighteenth aspect may further include the same configuration (steps) as in the second to fifteenth aspects. Further, a program for causing a medical image processing apparatus or a computer to execute the medical image processing method of these aspects, and a non-transitory recording medium recording a computer-readable code of the program can also be mentioned as an aspect of the present invention.
  • a medical image can be highlighted according to real-time property.
  • FIG. 1 is a diagram showing the configuration of the endoscope system according to the first embodiment.
  • FIG. 2 is another diagram showing the configuration of the endoscope system.
  • FIG. 3 is a functional block diagram of the image processing unit.
  • FIG. 4 is a diagram showing a configuration example of a convolutional neural network.
  • FIG. 5 is a diagram showing a state of convolution processing by a filter.
  • FIG. 6 is a diagram showing information recorded in the recording unit.
  • FIG. 7 is a flowchart showing the procedure of the medical image processing method according to the first embodiment.
  • FIG. 8 is a diagram showing an example of preferable highlighting for a real-time image.
  • FIG. 9 is a diagram showing an example of unfavorable highlighting on a real-time image.
  • FIG. 10 is a diagram showing a display example of part information.
  • FIG. 11 is another flowchart showing the procedure of the medical image processing method.
  • FIG. 12 is a diagram showing an example of highlighting according to the elapsed time.
  • FIG. 13 is another flowchart showing the procedure of the medical image processing method.
  • FIG. 14 is a diagram showing how highlighting of a plurality of images displayed in a list is collectively turned on or off.
  • FIG. 1 is an external view of an endoscope system 10 (medical image processing apparatus, endoscope system), and FIG. 2 is a block diagram showing a main configuration of the endoscope system 10.
  • an endoscope system 10 includes an endoscope scope 100 (medical device, endoscope scope, endoscope body), an endoscope processor device 200 (medical image processing device), a light source device 300 (light source device), and a monitor 400 (display device).
  • An external device (not shown) for acquiring information on the observation site by electromagnetic waves or ultrasonic waves may be connected to the endoscope system 10.
  • the endoscope scope 100 includes a hand operation unit 102 and an insertion unit 104 that is connected to the hand operation unit 102.
  • An operator grasps and operates the hand operation unit 102, inserts the insertion unit 104 into the body of the subject (living body), and observes it.
  • the hand operation unit 102 is provided with an air/water supply button 141, a suction button 142, a function button 143 to which various functions are assigned, and a photographing button 144 for accepting a photographing instruction operation (still image, moving image).
  • the insertion section 104 is composed of a flexible section 112, a bending section 114, and a distal end hard section 116 in this order from the hand operation section 102 side. That is, the bending portion 114 is connected to the proximal end side of the distal end hard portion 116, and the flexible portion 112 is connected to the proximal end side of the bending portion 114.
  • the hand operation unit 102 is connected to the proximal end side of the insertion unit 104. The user can bend the bending portion 114 by operating the hand operation unit 102 to change the direction of the distal end hard portion 116 up, down, left, or right.
  • An imaging optical system 130, an illumination unit 123, a forceps opening 126, and the like are provided in the distal end hard section 116 (see FIGS. 1 and 2).
  • by operating the operation unit 208 (see FIG. 2), white light and/or narrow-band light (one or more of red narrow-band light, green narrow-band light, blue narrow-band light, and violet narrow-band light) can be emitted from the illumination lenses 123A and 123B of the illumination unit 123.
  • cleaning water is discharged from a water supply nozzle (not shown) to clean the photographing lens 132 (photographing lens, photographing unit) of the photographing optical system 130 and the illumination lenses 123A and 123B.
  • a conduit (not shown) communicates with the forceps port 126 that opens at the distal end hard portion 116, and a treatment tool (not shown) for tumor removal or the like is inserted through this conduit and advanced or retracted as appropriate so that necessary measures can be taken on the subject.
  • a photographing lens 132 (imaging unit) is arranged on the distal end side end surface 116A of the distal rigid portion 116.
  • behind the photographing lens 132, a CMOS (Complementary Metal-Oxide Semiconductor) type image pickup element 134 (image pickup element, imaging unit), a drive circuit 136, and an AFE 138 (AFE: Analog Front End, imaging unit) are provided, and an image signal is output by these elements.
  • the image pickup element 134 is a color image pickup element and includes a plurality of pixels composed of a plurality of light receiving elements arranged in a matrix (two-dimensional arrangement) in a specific pattern arrangement (Bayer arrangement, X-Trans (registered trademark) arrangement, honeycomb arrangement, etc.).
  • Each pixel of the image sensor 134 includes a microlens, a red (R), a green (G), or a blue (B) color filter and a photoelectric conversion unit (photodiode or the like).
  • the photographing optical system 130 can also generate a color image from pixel signals of three colors of red, green, and blue, or generate an image from pixel signals of any one color of red, green, and blue.
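The patent only states that a color image can be generated from pixel signals of three colors, or an image from any one color. As a rough illustration, pulling a single color's samples out of a Bayer (RGGB) mosaic can be sketched as follows (a simplification: real demosaicing interpolates the missing positions, and the plain RGGB tiling assumed here is only one of the arrangements mentioned above):

```python
# Bayer pattern (RGGB) stand-in: each 2x2 cell is [[R, G], [G, B]].
def extract_channel(raw, color: str):
    """Collect the raw samples of one color from an RGGB mosaic."""
    offsets = {"R": [(0, 0)], "G": [(0, 1), (1, 0)], "B": [(1, 1)]}[color]
    samples = []
    for r in range(0, len(raw) - 1, 2):
        for c in range(0, len(raw[0]) - 1, 2):
            for dr, dc in offsets:
                samples.append(raw[r + dr][c + dc])
    return samples
```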
  • although an example in which the image pickup element 134 is a CMOS type image pickup element is described here, the image pickup element 134 may be a CCD (Charge Coupled Device) type.
  • Each pixel of the image sensor 134 may further include a purple color filter corresponding to the purple light source 310V and/or an infrared filter corresponding to the infrared light source.
  • the optical image of the subject is formed on the light receiving surface (imaging surface) of the image pickup element 134 by the taking lens 132, converted into an electric signal, output to the endoscope processor device 200 via a signal cable (not shown), and converted into an image signal. As a result, the endoscopic image is displayed on the monitor 400 connected to the endoscope processor device 200.
  • illumination lenses 123A and 123B of the illumination unit 123 are provided adjacent to the taking lens 132 on the distal end side end surface 116A of the distal rigid portion 116.
  • an exit end of a light guide 170, which will be described later, is disposed inside the illumination lenses 123A and 123B; the light guide 170 is inserted through the insertion section 104, the hand operation section 102, and the universal cable 106, and the incident end of the light guide 170 is arranged in the light guide connector 108.
  • by performing imaging at a predetermined frame rate while inserting or withdrawing the endoscope scope 100 (insertion unit 104) having the above-described configuration into or from a living body as a subject (under control of the imaging unit and the medical image acquisition unit 220), the user can sequentially capture in-vivo images.
  • the light source device 300 includes an illumination light source 310, a diaphragm 330, a condenser lens 340, a light source controller 350, and the like, and makes observation light incident on the light guide 170.
  • the light source 310 includes a red light source 310R, a green light source 310G, a blue light source 310B, and a violet light source 310V that emit red, green, blue, and violet narrow-band light, respectively, and can emit red, green, blue, and violet narrow-band light.
  • the illuminance of the observation light from the light source 310 is controlled by the light source control unit 350, and the illuminance of the observation light can be changed (increased or decreased) and the illumination can be stopped as necessary.
  • the light source 310 can emit red, green, blue, and violet narrow-band light in any combination. For example, red, green, blue, and violet narrow-band light can be emitted at the same time to irradiate white light (normal light) as observation light, or narrow-band light (special light) of any one or two of them can be emitted.
  • the light source 310 may further include an infrared light source that emits infrared light (an example of narrow band light). Further, white light or narrow band light may be irradiated as observation light by a light source that emits white light and a filter that transmits white light and each narrow band light.
  • the light source 310 may be a light source that emits light in the white band, a light source that emits light of a plurality of wavelength bands as white-band light, or a light source that emits light of a specific wavelength band narrower than the white wavelength band.
  • the specific wavelength band may be a visible blue band or a green band, or a visible red band.
  • when the specific wavelength band is the visible blue band or green band, it includes a wavelength band of 390 nm or more and 450 nm or less, or 530 nm or more and 550 nm or less, and the light of the specific wavelength band may have a peak wavelength within the wavelength band of 390 nm to 450 nm or 530 nm to 550 nm.
  • when the specific wavelength band is the visible red band, it includes a wavelength band of 585 nm or more and 615 nm or less, or 610 nm or more and 730 nm or less, and the light of the specific wavelength band may have a peak wavelength within the wavelength band of 585 nm to 615 nm or 610 nm to 730 nm.
  • the light of the specific wavelength band described above may include a wavelength band in which oxyhemoglobin and reduced hemoglobin have different absorption coefficients, and may have a peak wavelength in such a wavelength band.
  • the specific wavelength band may include a wavelength band of 400 ± 10 nm, 440 ± 10 nm, 470 ± 10 nm, or 600 nm or more and 750 nm or less, and may have a peak wavelength within the wavelength band of 400 ± 10 nm, 440 ± 10 nm, 470 ± 10 nm, or 600 nm to 750 nm.
  • the light generated by the light source 310 may include a wavelength band of 790 nm or more and 820 nm or less, or 905 nm or more and 970 nm or less, and may have a peak wavelength in a wavelength band of 790 nm or more and 820 nm or less or 905 nm or more and 970 nm or less.
  • the light source 310 may include a light source that emits excitation light having a peak of 390 nm or more and 470 nm or less.
  • in that case, a medical image (medical image, in-vivo image) can be acquired by photographing fluorescence emitted in the subject; a dye for the fluorescence method (fluorescein, acridine orange, etc.) may be used.
  • the light source type of the light source 310 (laser light source, xenon light source, LED light source (LED: Light-Emitting Diode), etc.), wavelength, presence or absence of a filter, and the like are preferably configured according to the type of subject, the site, the purpose of observation, and so on. In addition, it is preferable to combine and/or switch the wavelengths of the observation light according to the type of subject, the site, the purpose of observation, and the like during observation. When switching the wavelength, for example, the wavelength of the light to be irradiated may be switched by rotating a disk-shaped filter (rotary color filter) that is provided in front of the light source and has filters that transmit or block light of specific wavelengths.
  • the image pickup device used when implementing the present invention is not limited to the color image pickup device in which the color filter is arranged for each pixel like the image pickup device 134, and may be a monochrome image pickup device.
  • when a monochrome image pickup element is used, it is possible to sequentially switch the wavelength of the observation light and capture images in a field-sequential (color-sequential) manner.
  • for example, the wavelength of the emitted observation light may be sequentially switched among purple, blue, green, and red, or broadband light (white light) may be emitted and the wavelength of the observation light may be switched by a rotary color filter (red, green, blue, purple, etc.).
  • the narrow band light may be infrared light (first narrow band light, second narrow band light) of two or more wavelengths having different wavelengths.
  • the observation light emitted from the light source device 300 is transmitted to the illumination lenses 123A and 123B via the light guide 170, and the observation range is irradiated from the illumination lenses 123A and 123B.
  • the configuration of the endoscope processor device 200 will be described with reference to FIG.
  • the endoscope processor device 200 receives the image signal output from the endoscope scope 100 via the image input controller 202, performs necessary image processing in the image processing unit 204, and outputs the processed signal via the video output unit 206. As a result, an endoscopic image (medical image) is displayed on the monitor 400 (display device). These processes are performed under the control of the CPU 210 (CPU: Central Processing Unit).
  • the communication control unit 205 controls communication of medical images and region-of-interest information with an in-hospital system (HIS: Hospital Information System) (not shown), an in-hospital LAN (Local Area Network), and/or an external system or network.
  • an image of the subject (endoscopic image, medical image), region-of-interest information, display mode setting information, and the like are recorded in the recording unit 207 (recording device) (see FIG. 6 and related description).
  • under the control of the CPU 210 and the image processing unit 204, the voice processing unit 209 outputs a message (voice) regarding detection of the region of interest and the detection result from the speaker 209A.
  • the ROM 211 (ROM: Read Only Memory) is a non-volatile storage element (non-temporary recording medium) and stores computer-readable code of a program that causes the CPU 210 and/or the image processing unit 204 (medical image processing device, computer) to execute various image processing methods.
  • a RAM 212 (RAM: Random Access Memory) is a storage element for temporary storage during various types of processing, and can also be used as a buffer when acquiring an image.
  • through the operation unit 208, the user can give instructions to execute medical image processing and set conditions necessary for execution, and the display control unit 228 (see FIG. 3) can display the screen for these instructions, detection results, and the like on the monitor 400.
  • FIG. 3 is a functional block diagram of the image processing unit 204.
  • the image processing unit 204 includes a medical image acquisition unit 220 (medical image acquisition unit), an information acquisition unit 222 (information acquisition unit), a mode setting unit 226 (mode setting unit), a display control unit 228 (display control unit), a recording control unit 230 (recording control unit), a reception unit 232 (reception unit), a first time information acquisition unit 234 (first time information acquisition unit), and a second time information acquisition unit 235 (second time information acquisition unit).
  • The information acquisition unit 222 includes recognizers: a detector 224A, a classifier 224B, and a measuring instrument 224C. Details of medical image processing using these functions will be described later.
  • Using the functions described above, the image processing unit 204 can calculate feature amounts of a medical image, perform processing that emphasizes or reduces components of a specific frequency band, and perform processing that emphasizes or de-emphasizes specific targets (regions of interest, blood vessels at a desired depth, etc.).
  • The image processing unit 204 may include a special light image acquisition unit that acquires a special light image having information of a specific wavelength band based on a normal light image obtained by irradiating white band light, or light of a plurality of wavelength bands as white band light.
  • A signal in the specific wavelength band can be obtained by calculation based on the RGB (R: red, G: green, B: blue) or CMY (C: cyan, M: magenta, Y: yellow) color information included in the normal light image. Further, the image processing unit 204 may include a feature amount image generation unit that generates a feature amount image by calculation based on at least one of a normal light image obtained by irradiating white band light, or light of a plurality of wavelength bands as white band light, and a special light image obtained by irradiating light of a specific wavelength band; the feature amount image may be acquired and displayed as a medical image. The above processing is performed under the control of the CPU 210.
  • the functions of the respective units of the image processing unit 204 described above can be realized by using various processors and recording media.
  • the various processors include, for example, a CPU (Central Processing Unit) that is a general-purpose processor that executes software (programs) to realize various functions.
  • The above-mentioned various processors also include a GPU (Graphics Processing Unit), which is a processor specialized in image processing, and programmable logic devices (PLDs), such as FPGAs (Field Programmable Gate Arrays), whose circuit configuration can be changed after manufacture.
  • When learning or recognizing images, a configuration using a GPU is effective.
  • a dedicated electric circuit which is a processor having a circuit configuration specifically designed to execute a specific process such as an ASIC (Application Specific Integrated Circuit), is also included in the various processors described above.
  • The functions of each unit may be realized by one processor, or by a plurality of processors of the same or different types (for example, a plurality of FPGAs, a combination of a CPU and an FPGA, or a combination of a CPU and a GPU). A plurality of functions may also be realized by a single processor.
  • As examples of configuring a plurality of functions with a single processor, first, as typified by a computer, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor realizes a plurality of functions. Second, there is a form in which a processor that realizes the functions of the entire system with a single IC (Integrated Circuit) chip is used, as typified by a System on Chip (SoC).
  • various functions are configured by using one or more of the various processors described above as a hardware structure.
  • the hardware structure of these various processors is an electrical circuit in which circuit elements such as semiconductor elements are combined.
  • These electric circuits may be electric circuits that realize the above-described functions using logical OR, logical AND, logical NOT, exclusive OR, and logical operations combining these.
  • When the processor or electric circuit described above executes software (a program), the computer-readable code of the software is stored in a non-transitory recording medium such as the ROM 211, and the computer (for example, the various processors and electric circuits constituting the image processing unit 204, and/or combinations thereof) refers to that software.
  • The software stored in the non-transitory recording medium includes a program for executing the medical image processing method according to the present invention and data used in its execution (data used for specifying the display mode, and parameters used in the detector 224A, the classifier 224B, and the measuring instrument 224C).
  • Instead of the ROM 211, the code may be recorded in a non-transitory recording medium such as various magneto-optical recording devices or semiconductor memories.
  • In processing using software, for example, the RAM 212 (Random Access Memory) is used as a temporary storage area, and data stored in, for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory, not shown) can also be referred to. The recording unit 207 may be used as the “non-transitory recording medium”.
  • The recognizers described above (the detector 224A, the classifier 224B, and the measuring instrument 224C) can be configured using trained models such as a CNN (Convolutional Neural Network) or an SVM (Support Vector Machine), that is, models learned using an image set composed of images of a living body.
  • FIG. 4 is a diagram showing an example of the layer structure of CNN.
  • the CNN 562 includes an input layer 562A, an intermediate layer 562B, and an output layer 562C.
  • the input layer 562A inputs the endoscopic image (medical image) acquired by the medical image acquisition unit 220 and outputs the feature amount.
  • The intermediate layer 562B includes convolutional layers 564 and pooling layers 565, and receives the feature amounts output from the input layer 562A to calculate further feature amounts.
  • These layers have a structure in which a plurality of "nodes" are connected by "edges" and hold a plurality of weighting parameters.
  • the value of the weight parameter changes as the learning progresses.
  • The CNN 562 may include a fully connected layer 566, as in the example shown in part (b) of FIG. 4.
  • The layer configuration of the CNN 562 is not limited to alternating convolutional layers 564 and pooling layers 565 one by one; any of the layers (for example, the convolutional layer 564) may appear consecutively. A plurality of fully connected layers 566 may also be included consecutively.
  • the intermediate layer 562B calculates the feature amount by the convolution operation and the pooling process.
  • The convolution operation performed in the convolutional layer 564 acquires a feature map through convolution using a filter, and plays a role of feature extraction, such as edge extraction, from an image. A convolution operation with one filter generates a one-channel (one-sheet) “feature map”. The size of the “feature map” is downscaled by convolution, becoming smaller as convolution is performed in each layer.
  • The pooling process performed in the pooling layer 565 reduces (or enlarges) the feature map output by the convolution operation to obtain a new feature map, and plays a role in giving the extracted features robustness against translation and the like.
  • the intermediate layer 562B can be configured by one or a plurality of layers that perform these processes.
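  • As a rough illustration (not the patent's implementation; the filter values and the toy image below are arbitrary assumptions), the convolution and pooling operations described above can be sketched in Python:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid convolution of a 2-D image with a 2-D kernel (feature extraction)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: downscales the map, adding translation robustness."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# An edge-extraction style filter applied to a toy 6x6 "image"
image = np.arange(36, dtype=float).reshape(6, 6)
edge_filter = np.array([[1.0, -1.0], [1.0, -1.0]])  # horizontal-gradient kernel
fmap = conv2d(image, edge_filter)   # 5x5 feature map (downscaled by convolution)
pooled = max_pool(fmap)             # 2x2 after pooling
```

Each filter yields one feature-map channel, and the map shrinks with each convolution and pooling stage, as described above.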
  • FIG. 5 is a schematic diagram showing a configuration example of the intermediate layer 562B of the CNN 562 shown in FIG.
  • In the first convolutional layer, a convolution operation is performed between the image set composed of a plurality of medical images (a learning image set during learning, a recognition image set during recognition) and the filter F1.
  • the image set is composed of N (N-channel) images each having an image size of H in the vertical direction and W in the horizontal direction.
  • the images forming the image set are three-channel images of R (red), G (green), and B (blue).
  • Since the image set has N channels, the filter F1 convolved with this image set is a 5 × 5 × N filter.
  • a “feature map” of one channel (one sheet) is generated for one filter F 1 .
  • When the filter F2 used in the second convolutional layer is, for example, a filter of size 3 (3 × 3), the filter size is 3 × 3 × M, where M is the number of “feature map” channels output by the first convolutional layer.
  • the convolutional operations using the filters F 2 to F n are performed in the second to nth convolutional layers.
  • The size of the “feature map” in the nth convolutional layer is smaller than that in the second convolutional layer because it has been downscaled by the convolutional or pooling layers in the preceding stages.
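  • The downscaling described above can be made concrete with a small helper (the function names and the 64-pixel input are hypothetical examples, not from the patent):

```python
def conv_out_size(size, kernel, stride=1, pad=0):
    """Output side length of a convolution: floor((size + 2*pad - kernel) / stride) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

def pool_out_size(size, window):
    """Output side length of a non-overlapping pooling window."""
    return size // window

# Example: H = W = 64 input, 5x5 first filter, 2x2 pooling, then a 3x3 filter
s = conv_out_size(64, 5)   # 60 after the first convolutional layer
s = pool_out_size(s, 2)    # 30 after pooling
s = conv_out_size(s, 3)    # 28 after the second convolutional layer
```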
  • In the intermediate layer 562B, lower-order feature extraction (such as edge extraction) is performed in the convolutional layers close to the input side, and higher-order feature extraction (extraction of features related to the shape, structure, and the like of the target object) is performed toward the output side.
  • When segmentation is performed for the purpose of measurement or the like, upscaling is performed in the convolutional layers in the latter half, and the final convolutional layer yields a “feature map” of the same size as the input image set.
  • When detecting an object, upscaling is not essential, because it suffices to output position information.
  • the intermediate layer 562B may include a layer that performs batch normalization in addition to the convolutional layer 564 and the pooling layer 565.
  • The batch normalization process normalizes the distribution of data in mini-batch units during learning, and serves to speed up learning, reduce dependence on initial values, suppress overfitting, and the like.
  • The output layer 562C is a layer that detects the position of the region of interest appearing in the input medical image (normal light image, special light image) based on the feature amounts output from the intermediate layer 562B, and outputs the result.
  • The output layer 562C grasps the position of the region of interest shown in the image at the pixel level from the “feature map” obtained from the intermediate layer 562B; that is, it can detect whether each pixel of the endoscopic image belongs to the region of interest and output the detection result.
  • When detecting an object, determination at the pixel level is not necessary, and the output layer 562C outputs position information of the target object.
  • the output layer 562C executes the discrimination (classification) regarding the lesion and outputs the discrimination result.
  • For example, the output layer 562C may classify endoscopic images into three categories of “neoplastic”, “non-neoplastic”, and “other”, and may output the discrimination result as three scores corresponding to “neoplastic”, “non-neoplastic”, and “other” (the three scores sum to 100%), or may output the classification result when the three scores allow a clear classification.
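  • A minimal sketch (my own, not the patent's classifier) of how three class scores summing to 100% might be produced from raw network outputs with a softmax:

```python
import math

def class_scores(logits):
    """Softmax over raw outputs, expressed as percentages summing to 100%."""
    exps = [math.exp(v) for v in logits]
    total = sum(exps)
    return [100.0 * e / total for e in exps]

labels = ["neoplastic", "non-neoplastic", "other"]
scores = class_scores([2.0, 0.5, -1.0])  # example logits, not real network outputs
result = dict(zip(labels, scores))
top = max(result, key=result.get)  # category reported when clearly classifiable
```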
  • The output layer 562C may or may not include fully connected layers as the last one or more layers (see part (b) of FIG. 4).
  • the output layer 562C outputs the measurement result of the attention area.
  • The target region of interest can be segmented as described above, and measurement can then be performed by the image processing unit 204 or the like based on the result. It is also possible to output the measurement value of the target region of interest directly from the measuring instrument. When the measurement value is output directly, the measurement value itself is learned for the images, making it a regression problem over measurement values.
  • When using the CNN configured as described above, it is preferable that the result output by the output layer 562C be compared with the correct recognition answer for the image set to calculate the loss (error), and that the weighting parameters in the intermediate layer 562B be updated from the output-side layers toward the input-side layers so as to reduce the loss (error backpropagation).
  • The recognizers may perform recognition (detection of a region of interest, etc.) by a method other than a CNN. In that case, the region of interest can be detected based on the feature amounts of the pixels of the acquired medical image.
  • Specifically, the detector 224A divides the detection target image into, for example, a plurality of rectangular regions, sets each divided rectangular region as a local region, calculates a feature amount (for example, hue) of the pixels in each local region of the detection target image, and determines local regions having a specific hue among the local regions as regions of interest.
  • classification and measurement may be performed based on the feature amount.
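  • The rectangular local-region, hue-based detection described above could be sketched as follows (the block size and hue thresholds are illustrative assumptions, not values from the patent):

```python
import numpy as np

def detect_by_hue(hue_image, block=8, hue_lo=0.9, hue_hi=1.0):
    """Split a hue image into block x block local regions and flag regions
    whose mean hue falls in a specific range as candidate regions of interest."""
    h, w = hue_image.shape
    hits = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            mean_hue = hue_image[y:y + block, x:x + block].mean()
            if hue_lo <= mean_hue <= hue_hi:
                hits.append((y, x))
    return hits

# Toy 16x16 hue map with one "reddish" block in the top-left corner
hue = np.full((16, 16), 0.3)
hue[:8, :8] = 0.95
regions = detect_by_hue(hue)  # top-left local region is flagged
```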
  • As the recognizers, all of the detector 224A, the classifier 224B, and the measuring instrument 224C may be operated in parallel and the displayed result switched, or only one or two of the recognizers may be operated and their results displayed. Further, the endoscope system 10 may include only one or two of the detector 224A, the classifier 224B, and the measuring instrument 224C instead of all three.
  • FIG. 6 is a diagram showing an example of information recorded in the recording unit 207.
  • In the example of FIG. 6, an endoscopic image 260 (medical image), a recognition result 262 (detection result, classification result, measurement result), and display mode setting information 264 (the display mode used when highlighting the region of interest) are recorded.
  • the recording control unit 230 records these pieces of information in association with each other.
  • The reception unit 232 sets the conditions necessary for executing the medical image processing method based on the user's operation via the operation unit 208 and/or the display mode setting information 264 recorded in the recording unit 207 (default display modes such as whether highlighting is on or off and the selection of figures and symbols to display) (step S100: initial setting step).
  • For example, the image acquisition mode (real-time image acquisition or non-real-time image acquisition), whether highlighting is on or off, and the characters, figures, and symbols used for highlighting and their colors are set.
  • the conditions may be set or changed during execution of the following steps.
  • “Real time” and “real-time image” refer, for example, to sequentially acquiring, without substantial delay, a plurality of medical images captured at a predetermined frame rate in an ongoing examination or observation, and to images so acquired. Cases where image acquisition is slightly delayed relative to complete real time due to image transmission/reception, image processing, and the like are also included in “real time” and “real-time image”.
  • “non-real time” and “non-real time image” include, for example, a case where a plurality of medical images already recorded after the examination are sequentially acquired and displayed.
  • the medical image acquisition unit 220 acquires an endoscopic image (medical image) taken in the living body of the subject (step S110: medical image acquisition step).
  • The medical image acquisition unit 220 can sequentially capture the inside of the living body, which is the subject, at a predetermined frame rate using the imaging unit of the endoscope 100 (medical device) (the imaging lens 132, the imaging element 134, the AFE 138, etc.), thereby acquiring endoscopic images in real time (real-time image acquisition).
  • the medical image acquisition unit 220 may acquire an endoscopic image that has already been captured and recorded in non-real time (acquisition of non-real time image).
  • the endoscopic image 260 recorded in the recording unit 207 may be acquired, or the image may be acquired from an external device or system via the communication control unit 205.
  • When site information indicating the observation site of the subject has already been acquired, an endoscopic image (medical image) captured with observation light in a wavelength band corresponding to the site indicated by the site information may be acquired. For example, an image captured with white light in the case of the stomach and an image captured with blue narrow-band light in the case of the esophagus can be acquired.
  • the display control unit 228 causes the monitor 400 to display the acquired endoscopic image.
  • the detector 224A detects the attention area from the endoscopic image (medical image) (step S120: information acquisition step).
  • The detector 224A grasps the position of the region of interest shown in the image at the pixel level from the “feature map” (that is, it can detect whether each pixel of the endoscopic image belongs to the region of interest) and can output the detection result.
  • Examples of regions of interest to be detected include polyps, cancers, colonic diverticula, inflammation, treatment scars (EMR scars (EMR: Endoscopic Mucosal Resection), ESD scars (ESD: Endoscopic Submucosal Dissection), clip locations, etc.), bleeding points, perforations, and vascular atypia.
  • the classifier 224B and the measuring device 224C may perform classification and measurement in step S120.
  • the endoscopic image (medical image) to be detected, classified, and measured may be a real-time image or a non-real-time image.
  • The mode setting unit 226 determines whether or not the image acquired in step S110 is a real-time image (step S130: determination step, mode setting step). Based on this determination result, when the image is a real-time image (YES in step S130), the mode setting unit 226 sets the display mode for real-time images (step S140: mode setting step), and when it is a non-real-time image (NO in step S130), it sets the display mode for non-real-time images (step S150: mode setting step).
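  • The branching of steps S130 to S150 can be paraphrased as follows (the function and mode names are hypothetical; the real system sets richer display parameters):

```python
def set_display_mode(is_real_time, user_settings=None):
    """Choose the highlighting mode per steps S130-S150: unobtrusive marks for
    real-time observation, more prominent marks for stored (non-real-time) images."""
    settings = user_settings or {}
    if is_real_time:
        mode = {"highlight": "outline_or_arrow",  # e.g. FIG. 8 (b)/(c)
                "obscure_roi": False}             # never cover the mucosal structure
    else:
        mode = {"highlight": "fill_or_side_panel",  # e.g. FIG. 9 (a)/(b)
                "obscure_roi": True}
    mode.update(settings)  # user operations can override the defaults
    return mode

rt = set_display_mode(True)
nrt = set_display_mode(False, {"highlight": "off"})  # user turned highlighting off
```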
  • The display control unit 228 displays the endoscopic image (medical image) on the monitor 400 (display device) according to the display mode (the mode of highlighting the region of interest) set in step S140 or S150 (step S160: display control step).
  • FIG. 8 is a diagram showing a display example in the case of a real-time image (when highlighting is set to ON).
  • Parts (b) and (c) of the figure show states in which the region of interest 802 in the endoscopic image 800 is highlighted by a figure 804 and an arrow 806, respectively (superimposed on the endoscopic image 800). These highlights directly indicate the position of the region of interest, but do not obstruct observation of the mucosal structure because no figure is superimposed on the region of interest itself.
  • part (d) of the figure shows a state in which the frame 808 is displayed in a part outside the screen (outside the display range of the endoscopic image 800 in the display area of the monitor 400).
  • FIG. 8A is a diagram (for reference) showing a state in which highlighting is not performed, and shows an endoscopic image 800 in which a region of interest 802 is shown.
  • FIG. 9 is a diagram showing a display example in the case of a non-real-time image (when highlighting is set to ON).
  • Part (a) of the figure shows a state in which the region of interest 802 is filled in, and part (b) shows the endoscopic image 820 and a recognition result display image 822 on two screens, with a symbol 824 displayed at the position corresponding to the region of interest 802 in the recognition result display image 822.
  • The display modes of FIG. 9 are not suitable for real-time display because they may hinder observation, but they pose no problem for display after storage (including cases where an image recorded in the recording unit 207 or the like is acquired and displayed in non-real time).
  • Rather, since display after storage is often used to observe the region of interest in detail, a display method in which the position of the region of interest cannot be accurately grasped, as in parts (d) to (f) of FIG. 8, is not preferable.
  • Since the display size per image may be reduced in order to display a list of the image group acquired during the examination, more prominent emphasis processing is preferable; accordingly, a mode such as that shown in FIG. 9 can be adopted.
  • the display position and/or size of the observation screen may be changed. By doing so, it is possible to display an observation screen that fully utilizes the monitor size.
  • FIGS. 8 and 9 are merely examples of display modes. The mode of FIG. 9 may be displayed in the case of a real-time image, and the mode of FIG. 8 in the case of a non-real-time image. Although FIG. 9 shows an example in which highlighting is turned on for a non-real-time image, highlighting may be turned off in response to a user operation or when a predetermined condition is satisfied; the same applies in the case of a real-time image. In addition to the display modes illustrated in FIGS. 8 and 9, changes in color tone and gradation, frequency processing, and the like may be performed. Such display mode settings can be made by the reception unit 232 receiving a user operation (display mode setting operation) via the operation unit 208 and by the mode setting unit 226 setting the display mode based on that operation.
  • the mode setting unit 226 may set the display mode based on the classification result and the measurement result of the endoscopic image, and the display control unit 228 may perform the display according to the setting.
  • For example, the classification result of an endoscopic image can be set as “neoplastic”, “non-neoplastic”, or “other”, and a display mode using characters, figures, symbols, and the like can be set and displayed.
  • The color may be changed according to the classification result.
  • the measurement result may be displayed as a numerical value such as “diameter 5 mm”, or a figure, a symbol, or the like indicating the size of the attention area may be displayed. Also in this case, the color may be changed according to the result.
  • the number of monitors (display devices) in the endoscope system 10 is not particularly limited. Both the real-time image and the non-real-time image may be displayed on the monitor 400. In this case, for example, a real-time image can be displayed and a specific frame can be freeze-displayed according to a user's instruction or the like. Alternatively, a monitor for real-time images and a monitor for non-real-time images may be separately provided, and the images may be displayed on the respective monitors. When a separate monitor is provided, for example, a real-time image may be displayed on the monitor 400, and the image acquired by the inspection may be displayed on another monitor in non-real time when creating a report. Both aspects are included in "displaying an endoscopic image (medical image) on a display device".
  • the display control unit 228 may display the part information (information indicating the part shown in the endoscopic image) together with the highlighted display of the endoscopic image.
  • FIG. 10 is a diagram showing a display example of part information.
  • Part (a) of FIG. 10 is an example in which site information 836 is displayed as text (“gastric”) in addition to the frame 834 surrounding the region of interest 832 in the endoscopic image 830, and part (b) of FIG. 10 is an example in which site information is displayed by an arrow 842A on a schematic diagram 842 in the endoscopic image 840.
  • the part information may be displayed with other icons, and the display mode (contents of characters and symbols, position, shape, color, etc.) may be changed according to the part.
  • The site information can be acquired by analyzing the endoscopic image, by user input, or from an external device using electromagnetic waves, ultrasonic waves, or the like.
  • The medical image acquisition unit 220 may acquire an endoscopic image (medical image) captured with observation light in a wavelength band corresponding to the site shown in the endoscopic image, and the display control unit 228 may display the recognition result of the medical image captured with observation light in that wavelength band on the monitor 400 (display device). For example, recognition results can be provided for an image captured with white light (normal light) in the case of the stomach and an image captured with special light (blue narrow-band light) such as BLI (Blue Laser Imaging: registered trademark) in the case of the esophagus.
  • Depending on the site, image processing such as LCI (Linked Color Imaging: registered trademark), in which the saturation difference and hue difference of colors close to the mucous membrane color are expanded, may be used.
  • the image processing according to the part can be performed by the image processing unit 204.
  • The recording control unit 230 (recording control unit) records the endoscopic image and the region-of-interest information in the recording unit 207 as the endoscopic image 260 and the recognition result 262, respectively (step S170: recording control step). If classification or measurement is performed in step S120, its result is recorded as well. Further, when image processing is applied to the endoscopic image, the processed endoscopic image can be recorded, and the display mode setting information 264 may be recorded together with the endoscopic image 260 and the recognition result 262. These pieces of information are preferably recorded in association with each other.
  • the image processing unit 204 determines whether to end the process (step S180), and if the process is to be continued (NO in step S180), the process returns to step S110, and the above-described process is performed for the next frame, for example.
  • As described above, the endoscope system 10 can highlight medical images according to their real-time characteristics. Further, since the medical image and the region-of-interest information are recorded in the recording device in association with each other (the information is recorded without being superimposed on the image), when the medical image and the information are used later (in the case of a non-real-time image), they can be displayed in a display mode according to the purpose of observation and the user's wishes.
  • FIG. 11 is a flowchart showing a display mode setting process based on the information of the elapsed time. Since steps other than steps S152 to S156 are the same as those in FIG. 7, detailed description thereof will be omitted.
  • The first time information acquisition unit 234 acquires information on the elapsed time (first elapsed time) during which an endoscopic image (medical image) showing the same region of interest is displayed on the monitor 400 (display device) (step S152: time information acquisition step), and the mode setting unit 226 determines whether or not the first elapsed time is equal to or greater than a threshold value (step S154: determination step).
  • the first time information acquisition unit 234 can acquire (calculate) information on the elapsed time based on, for example, the frame rate of shooting and the number of frames in which the same attention area is reflected.
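  • The elapsed-time calculation and the threshold comparison of steps S152 to S156 can be sketched as follows (the threshold value and display styles are illustrative assumptions):

```python
def first_elapsed_time(frame_count, frame_rate_fps):
    """Elapsed display time (seconds) of the same region of interest,
    computed from the shooting frame rate and the number of frames showing it."""
    return frame_count / frame_rate_fps

def emphasis_for(elapsed_s, threshold_s=2.0):
    """Step S154/S156: lower the emphasis once the threshold has passed
    (the opposite pattern is also possible, per the text)."""
    return "solid_outline" if elapsed_s < threshold_s else "dotted_outline"

elapsed = first_elapsed_time(frame_count=90, frame_rate_fps=30.0)  # 3.0 s
style = emphasis_for(elapsed)  # dotted once past the threshold
```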
  • When the first elapsed time is equal to or greater than the threshold value (YES in step S154), the mode setting unit 226 resets the display mode based on the information of the first elapsed time (step S156: mode setting step); when the first elapsed time is less than the threshold value (NO in step S154), the display mode is maintained.
  • FIG. 12 is a diagram showing a state in which the display mode is reset according to the first elapsed time.
  • a graphic 804A surrounding the attention area 802 is displayed by a dotted line, and the degree of emphasis is low.
  • the mode setting unit 226 and the display control unit 228 can switch, for example, the state of the part (b) of FIG. 8 (the graphic 804 is displayed by a solid line) and the state of FIG. 12 according to the first elapsed time.
  • For example, when the first elapsed time is short, the display can be as shown in part (b) of FIG. 8, raising the degree of emphasis to call attention; thereafter, the degree of emphasis can be lowered as shown in FIG. 12.
  • The opposite pattern may also be used, depending on the user's operation or the purpose of observation. Further, instead of changing the line type of the figure as in FIGS. 8 and 12, the shape and size of figures and symbols, their color, and their brightness may be changed.
  • Highlighting may be turned on or off, or the degree of emphasis increased (or decreased), depending on the elapsed time. For example, highlighting may be off at the beginning of detection and then turned on (or the degree of emphasis increased with the first elapsed time), or vice versa (the degree of emphasis decreased with the first elapsed time). Highlighting may also be repeatedly turned on and off at predetermined time intervals.
  • Such processing can be applied when, for example, endoscopic images are sequentially acquired and displayed at a predetermined frame rate, the same region of interest appears in a plurality of frames, and the display mode is set according to the elapsed time (first elapsed time) of that state.
  • the display mode when the attention area is highlighted may be set based on information on the elapsed time during which the same endoscopic image is displayed.
  • the processing in this case can be performed in the same manner as the flowchart of FIG.
  • Specifically, the second time information acquisition unit 235 acquires information on the elapsed time (second elapsed time) during which the same endoscopic image is displayed on the monitor 400 (display device) (step S152: time information acquisition step), and the mode setting unit 226 determines whether or not the second elapsed time is equal to or greater than the threshold value (step S154: determination step).
  • When the second elapsed time is equal to or greater than the threshold value (YES in step S154), the mode setting unit 226 resets the display mode based on the information of the second elapsed time (step S156: mode setting step); when the second elapsed time is less than the threshold value (NO in step S154), the display mode is maintained.
  • the display mode can be the same as that based on the first elapsed time.
  • Such processing can be applied when the display mode is set according to the elapsed time (second elapsed time) during which a specific image is continuously displayed, for example, when an endoscopic image is continuously observed by freeze display or the like.
  • the display control unit 228 causes the monitor 400 (display device) to display a list of a plurality of medical images recorded in the recording unit 207 (recording device) (step S160: display control step).
  • Part (a) of FIG. 14 shows a state in which six images are displayed with the highlighting being off, and the attention area 802 is detected in the endoscopic images 850 and 852.
  • the mode setting unit 226 determines whether there is a change operation (step S162: determination step).
  • the mode setting unit 226 can determine that “there is a change operation” when the user performs a change operation via the operation unit 208 and the reception unit 232 receives the operation.
  • When there is a change operation (YES in step S162), the mode setting unit 226 sets the highlighting mode for the plurality of medical images displayed in the list to a common mode; for example, highlighting of the plurality of endoscopic images (medical images) is collectively set to on (or off).
  • the display control unit 228 redisplays the displayed endoscopic images in the changed display mode (common mode) (step S164: display control step).
  • For example, as shown in part (b) of FIG. 14, the display control unit 228 displays frames 854 and 856 for the endoscopic images 850 and 852 in which the region of interest 802 was detected (no frame is displayed for the other images, because no region of interest was detected in them).
  • the "common aspect” is that "a frame surrounding the entire image is displayed for the image showing the region of interest". With such a display, the user can easily select an image in which the region of interest is reflected (that is, an image that requires close examination) from a large number of images after the inspection.
  • The emphasis method may also be changed between the list display and the single-image display. For example, in the list display a frame surrounding the entire image is displayed as in part (b) of FIG. 14, while in the single-image display a figure or frame surrounding the attention area 802 can be displayed as in part (b) of FIG. 8.
  • This is because in the list display only the information of whether or not an attention area appears in each image is needed, so the display of part (b) of FIG. 14 is sufficient, whereas in the single-image display information on the position of the attention area within the image is also required, so a figure surrounding the attention area is displayed.
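The per-context choice of emphasis described above can be sketched as a small dispatch function. This is an illustrative sketch only; the function name `emphasis_for` and the mode labels are assumptions, not terms from the source.

```python
def emphasis_for(image_has_roi, context):
    """Pick how to emphasize an image: in the list view only the presence
    or absence of an attention area matters (frame around the whole
    image); in the single-image view its position matters (figure around
    the attention area itself)."""
    if not image_has_roi:
        return None                      # nothing to emphasize
    if context == "list":
        return "frame_whole_image"       # as in FIG. 14(b)
    return "figure_around_roi"           # single-image display

# Same image, different context, different emphasis.
assert emphasis_for(True, "list") == "frame_whole_image"
assert emphasis_for(True, "single") == "figure_around_roi"
assert emphasis_for(False, "list") is None
```

Keeping the choice in one place makes it easy to apply a "common mode" uniformly to every image in the list when the user toggles highlighting on or off.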
  • the medical image analysis processing unit detects an attention area, which is a region to be noticed, based on the feature amounts of the pixels of the medical image, and
  • the medical image analysis result acquisition unit is a medical image processing apparatus that acquires the analysis result of the medical image analysis processing unit.
  • the medical image analysis processing unit detects the presence or absence of a target to be noticed based on the feature amounts of the pixels of the medical image, and
  • the medical image analysis result acquisition unit is a medical image processing apparatus that acquires the analysis result of the medical image analysis processing unit.
  • the medical image processing apparatus in which the analysis result is either or both of an attention area included in the medical image and the presence or absence of a target to be noticed.
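Detection from pixel feature amounts, as in the appendix claims above, can be illustrated with a toy detector. This is a deliberately simplified sketch: a real analysis processing unit would use a trained recognizer (e.g. a neural network), and the thresholding scheme, function name, and bounding-box return format here are all assumptions.

```python
def detect_attention_region(feature_map, threshold=0.5):
    """Toy detector: flag pixels whose feature amount exceeds a threshold
    and return the bounding box (x0, y0, x1, y1) of the flagged region,
    or None when no target of interest is present."""
    coords = [(x, y)
              for y, row in enumerate(feature_map)
              for x, v in enumerate(row) if v > threshold]
    if not coords:
        return None                      # "absence" of a target of interest
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs), max(ys))

fm = [[0.0] * 8 for _ in range(8)]
fm[2][5] = fm[2][6] = fm[3][5] = fm[3][6] = 0.9   # synthetic high-feature blob
assert detect_attention_region(fm) == (5, 2, 6, 3)
```

Returning `None` versus a box covers both claim variants: presence/absence of a target and the attention area itself.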
  • the medical image processing apparatus in which the medical image is a normal-light image obtained by irradiating light in the white band, or light in a plurality of wavelength bands as the white-band light.
  • the medical image processing apparatus in which the medical image is an image obtained by irradiating light in a specific wavelength band, and the specific wavelength band is narrower than the white wavelength band.
  • the medical image processing apparatus in which the specific wavelength band includes a wavelength band of 390 nm or more and 450 nm or less or 530 nm or more and 550 nm or less, and the light of the specific wavelength band has a peak wavelength in the wavelength band of 390 nm or more and 450 nm or less or 530 nm or more and 550 nm or less.
  • the medical image processing apparatus in which the specific wavelength band includes a wavelength band of 585 nm or more and 615 nm or less or 610 nm or more and 730 nm or less, and the light of the specific wavelength band has a peak wavelength within a wavelength band of 585 nm or more and 615 nm or less or 610 nm or more and 730 nm or less.
  • the specific wavelength band includes a wavelength band in which the absorption coefficient differs between oxyhemoglobin and reduced hemoglobin, and the light of the specific wavelength band has a peak wavelength in a wavelength band in which the absorption coefficient differs between oxyhemoglobin and reduced hemoglobin.
  • the medical image processing apparatus in which the specific wavelength band includes 400±10 nm, 440±10 nm, 470±10 nm, or a wavelength band of 600 nm or more and 750 nm or less, and the light of the specific wavelength band has a peak wavelength in the wavelength band of 400±10 nm, 440±10 nm, 470±10 nm, or 600 nm or more and 750 nm or less.
  • the medical image processing apparatus in which the medical image is an in-vivo image of the inside of a living body, and the in-vivo image has information on fluorescence emitted by a fluorescent substance in the living body.
  • the medical image processing apparatus in which the fluorescence is obtained by irradiating the inside of the living body with excitation light having a peak in the range of 390 to 470 nm.
  • the medical image processing apparatus in which the medical image is an in-vivo image of the inside of a living body, and the specific wavelength band is a wavelength band of infrared light.
  • the medical image processing apparatus in which the specific wavelength band includes a wavelength band of 790 nm or more and 820 nm or less or 905 nm or more and 970 nm or less, and the light of the specific wavelength band has a peak wavelength in a wavelength band of 790 nm or more and 820 nm or less or 905 nm or more and 970 nm or less.
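The wavelength-band conditions above can be collected into a small lookup for checking a light source against the claimed ranges. This is an illustrative sketch only: the band names and the `peak_in_band` function are assumptions introduced here; the numeric ranges are transcribed from the text (400±10, 440±10, and 470±10 nm expand to 390–410, 430–450, and 460–480 nm).

```python
# Claimed specific wavelength bands in nm, transcribed from the text above.
BANDS = {
    "narrow_blue_green": [(390, 450), (530, 550)],
    "red":               [(585, 615), (610, 730)],
    "hemoglobin":        [(390, 410), (430, 450), (460, 480), (600, 750)],
    "infrared":          [(790, 820), (905, 970)],
}

def peak_in_band(peak_nm, band_name):
    """Check whether a light source's peak wavelength falls inside one of
    the ranges listed for the named specific wavelength band."""
    return any(lo <= peak_nm <= hi for lo, hi in BANDS[band_name])

assert peak_in_band(405, "narrow_blue_green")    # e.g. narrow-band blue light
assert not peak_in_band(500, "narrow_blue_green")
assert peak_in_band(940, "infrared")
```

Each claim pairs an inclusion condition ("the band includes...") with a peak condition ("the light has a peak wavelength in..."); the helper checks the latter for a given source.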
  • the medical image processing apparatus including a special-light image acquisition unit that acquires a special-light image having information in a specific wavelength band based on a normal-light image obtained by irradiating light in the white band, or light in a plurality of wavelength bands as the white-band light, in which the medical image is the special-light image.
  • the medical image processing apparatus including a feature-amount image generation unit that generates a feature-amount image, in which the medical image is the feature-amount image.
  • Appendix 19: An endoscope apparatus comprising the medical image processing apparatus according to any one of appendices 1 to 18, and an endoscope that irradiates at least one of light in the white wavelength band and light in a specific wavelength band to acquire images.
  • Appendix 20: A diagnosis support apparatus comprising the medical image processing apparatus according to any one of appendices 1 to 18.
  • Appendix 21: A medical service support apparatus comprising the medical image processing apparatus according to any one of appendices 1 to 18.
  • Reference signs: Endoscope system · 100 Endoscope (scope) · 102 Handheld operation section · 104 Insertion section · 106 Universal cable · 108 Light guide connector · 112 Soft section · 114 Bending section · 116 Distal rigid section · 116A Distal-side end face · 123 Illumination section · 123A Illumination lens · 123B Illumination lens · 126 Forceps port · 130 Imaging optical system · 132 Imaging lens · 134 Imaging element · 136 Drive circuit · 138 AFE · 141 Air/water supply button · 142 Suction button · 143 Function button · 144 Imaging button · 170 Light guide
  • 200 Processor · 202 Image input controller · 204 Image processing unit · 205 Communication control unit · 206 Video output unit · 207 Recording unit · 208 Operation unit · 209 Audio processing unit · 209A Speaker · 210 CPU · 211 ROM · 212 RAM · 220 Medical image acquisition unit · 222 Information acquisition unit · 224A Detector · 224B Classifier · 224C Measuring unit · 226 Mode setting unit · 228 Display control unit · 230 Recording control unit · 232 Reception unit · 234 First time information acquisition unit · 235 Second time information acquisition unit · 260 Endoscopic image · 262 Recognition result · 264 Display mode setting information · 300 Light source device · 310 Light source

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

An object of the present invention is to provide a medical image processing apparatus, an endoscope system, and a medical image processing method capable of highlighting medical images according to whether they are acquired in real time. According to one embodiment of the present invention, the medical image processing apparatus comprises: a medical image acquisition unit that acquires a medical image; an information acquisition unit that acquires information on a region of interest in the medical image; a mode setting unit that, on the basis of the information, sets the highlighting of the region of interest to a display mode according to whether the medical image is a real-time image acquired in real time or a non-real-time image acquired in non-real time; a display control unit that causes the medical image to be displayed on a display device in the set display mode; and a recording control unit that causes the medical image and the information to be associated with each other and recorded in a recording device.
PCT/JP2020/004224 2019-02-19 2020-02-05 Medical image processing device, endoscope system, and medical image processing method WO2020170809A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021501830A JPWO2020170809A1 (ja) 2019-02-19 2020-02-05 Medical image processing device, endoscope system, and medical image processing method
JP2022200252A JP2023026480A (ja) 2019-02-19 2022-12-15 Medical image processing device, endoscope system, and operation method of medical image processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019027514 2019-02-19
JP2019-027514 2019-02-19

Publications (1)

Publication Number Publication Date
WO2020170809A1 true WO2020170809A1 (fr) 2020-08-27

Family

ID=72144653

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/004224 WO2020170809A1 (fr) 2019-02-19 2020-02-05 Medical image processing device, endoscope system, and medical image processing method

Country Status (2)

Country Link
JP (2) JPWO2020170809A1 (fr)
WO (1) WO2020170809A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022202401A1 (fr) * 2021-03-22 2022-09-29 富士フイルム株式会社 Dispositif de traitement d'image médicale, système d'endoscope, procédé de traitement d'image médicale et programme de traitement d'image médicale
WO2023276017A1 (fr) * 2021-06-30 2023-01-05 オリンパスメディカルシステムズ株式会社 Dispositif de traitement d'image, système d'endoscope et procédé de traitement d'image
WO2023090091A1 (fr) * 2021-11-19 2023-05-25 富士フイルム株式会社 Dispositif de traitement d'image et système d'endoscope

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003093339A (ja) * 1991-03-11 2003-04-02 Olympus Optical Co Ltd 画像処理装置
WO2009008125A1 (fr) * 2007-07-12 2009-01-15 Olympus Medical Systems Corp. Dispositif de traitement d'image, son procédé de fonctionnement et son programme
JP2010172673A (ja) * 2009-02-02 2010-08-12 Fujifilm Corp 内視鏡システム、内視鏡用プロセッサ装置、並びに内視鏡検査支援方法
WO2018020558A1 (fr) * 2016-07-25 2018-02-01 オリンパス株式会社 Dispositif, procédé et programme de traitement d'image
WO2018221033A1 (fr) * 2017-06-02 2018-12-06 富士フイルム株式会社 Dispositif de traitement des images médicales, système d'endoscope, dispositif d'aide au diagnostic et dispositif d'aide au travail médical

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2594627B2 (ja) * 1988-02-26 1997-03-26 オリンパス光学工業株式会社 電子内視鏡装置
WO2015105951A1 (fr) * 2014-01-08 2015-07-16 Board Of Regents Of The University Of Texas System Système et procédé permettant une imagerie par fluorescence intra-opératoire dans une lumière ambiante
JP6389136B2 (ja) * 2015-03-30 2018-09-12 富士フイルム株式会社 内視鏡撮影部位特定装置、プログラム
WO2017057574A1 (fr) * 2015-09-29 2017-04-06 富士フイルム株式会社 Appareil de traitement d'image, système d'endoscope, et procédé de traitement d'image
JP6525918B2 (ja) * 2016-04-20 2019-06-05 富士フイルム株式会社 内視鏡システム、画像処理装置、及び画像処理装置の作動方法
CN110868907B (zh) * 2017-04-28 2022-05-17 奥林巴斯株式会社 内窥镜诊断辅助系统、存储介质和内窥镜诊断辅助方法
US20200129042A1 (en) * 2017-05-25 2020-04-30 Nec Corporation Information processing apparatus, control method, and program



Also Published As

Publication number Publication date
JP2023026480A (ja) 2023-02-24
JPWO2020170809A1 (ja) 2021-12-02

Similar Documents

Publication Publication Date Title
JP7346285B2 (ja) Medical image processing device, endoscope system, operation method of medical image processing device, and program
JP7252970B2 (ja) Medical image processing device, endoscope system, and operation method of medical image processing device
WO2020162275A1 (fr) Medical image processing device, endoscope system, and medical image processing method
JP7048732B2 (ja) Image processing device, endoscope system, and image processing method
WO2020170809A1 (fr) Medical image processing device, endoscope system, and medical image processing method
JP7289296B2 (ja) Image processing device, endoscope system, and operation method of image processing device
JP6941233B2 (ja) Image processing device, endoscope system, and image processing method
JP7062068B2 (ja) Image processing method and image processing device
JP2020069300A (ja) Medical diagnosis support device, endoscope system, and medical diagnosis support method
JP7146925B2 (ja) Medical image processing device, endoscope system, and operation method of medical image processing device
US20220151462A1 (en) Image diagnosis assistance apparatus, endoscope system, image diagnosis assistance method, and image diagnosis assistance program
JP7374280B2 (ja) Endoscope device, endoscope processor, and operation method of endoscope device
WO2019167623A1 (fr) Image processing device, endoscope system, and image processing method
WO2021157487A1 (fr) Medical image processing device, endoscope system, medical image processing method, and program
US20210082568A1 (en) Medical image processing device, processor device, endoscope system, medical image processing method, and program
WO2022181748A1 (fr) Medical image processing device, endoscope system, medical image processing method, and medical image processing program
WO2022186109A1 (fr) Medical image processing device, endoscope system, medical image processing method, and medical image processing program
WO2021029293A1 (fr) Medical image processing device, endoscope system, and medical image processing method
WO2021153471A1 (fr) Medical image processing device, medical image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20759311

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021501830

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20759311

Country of ref document: EP

Kind code of ref document: A1